Mar 08 00:05:44 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 08 00:05:44 crc restorecon[4697]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:44 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 08 00:05:45 crc restorecon[4697]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc 
restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc 
restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 
00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 
crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 
00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 00:05:45 crc 
restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc 
restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 00:05:45 crc restorecon[4697]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 
crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc 
restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc 
restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc 
restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc 
restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc 
restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 00:05:45 crc restorecon[4697]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 00:05:45 crc restorecon[4697]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 08 00:05:46 crc kubenswrapper[4713]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 00:05:46 crc kubenswrapper[4713]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 08 00:05:46 crc kubenswrapper[4713]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 00:05:46 crc kubenswrapper[4713]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 08 00:05:46 crc kubenswrapper[4713]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 08 00:05:46 crc kubenswrapper[4713]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.288107 4713 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293873 4713 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293888 4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293893 4713 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293897 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293902 4713 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293906 4713 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293944 4713 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293949 4713 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293954 4713 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293958 4713 
feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293962 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293967 4713 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293972 4713 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293977 4713 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293981 4713 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293985 4713 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293989 4713 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293993 4713 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.293997 4713 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294001 4713 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294004 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294008 4713 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294012 4713 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294015 4713 feature_gate.go:330] unrecognized 
feature gate: NetworkSegmentation Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294019 4713 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294023 4713 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294026 4713 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294030 4713 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294033 4713 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294051 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294054 4713 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294059 4713 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294062 4713 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294067 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294070 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294075 4713 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294080 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294084 4713 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294087 4713 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294090 4713 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294093 4713 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294098 4713 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294102 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294106 4713 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294110 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294114 4713 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294117 4713 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294122 4713 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294126 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294130 4713 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 
00:05:46.294133 4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294137 4713 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294141 4713 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294145 4713 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294148 4713 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294152 4713 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294155 4713 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294160 4713 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294163 4713 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294167 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294172 4713 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294176 4713 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294180 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294184 4713 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294188 4713 feature_gate.go:330] unrecognized feature gate: Example Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294193 4713 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294197 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294201 4713 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294206 4713 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294211 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.294214 4713 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294298 4713 flags.go:64] FLAG: --address="0.0.0.0" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294307 4713 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294315 4713 flags.go:64] FLAG: --anonymous-auth="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294321 4713 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294327 4713 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294331 4713 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294337 4713 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294343 4713 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294347 4713 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294351 4713 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294356 4713 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294361 4713 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294366 4713 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294371 4713 flags.go:64] FLAG: --cgroup-root="" 
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294375 4713 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294380 4713 flags.go:64] FLAG: --client-ca-file="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294385 4713 flags.go:64] FLAG: --cloud-config="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294389 4713 flags.go:64] FLAG: --cloud-provider="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294394 4713 flags.go:64] FLAG: --cluster-dns="[]" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294401 4713 flags.go:64] FLAG: --cluster-domain="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294406 4713 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294411 4713 flags.go:64] FLAG: --config-dir="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294415 4713 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294420 4713 flags.go:64] FLAG: --container-log-max-files="5" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294426 4713 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294431 4713 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294436 4713 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294441 4713 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294446 4713 flags.go:64] FLAG: --contention-profiling="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294450 4713 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294454 4713 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 08 
00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294459 4713 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294464 4713 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294470 4713 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294475 4713 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294480 4713 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294486 4713 flags.go:64] FLAG: --enable-load-reader="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294490 4713 flags.go:64] FLAG: --enable-server="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294494 4713 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294501 4713 flags.go:64] FLAG: --event-burst="100" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294506 4713 flags.go:64] FLAG: --event-qps="50" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294510 4713 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294515 4713 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294520 4713 flags.go:64] FLAG: --eviction-hard="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294525 4713 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294529 4713 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294533 4713 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294538 4713 flags.go:64] FLAG: --eviction-soft="" 
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294542 4713 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294547 4713 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294551 4713 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294555 4713 flags.go:64] FLAG: --experimental-mounter-path="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294559 4713 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294564 4713 flags.go:64] FLAG: --fail-swap-on="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294568 4713 flags.go:64] FLAG: --feature-gates="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294573 4713 flags.go:64] FLAG: --file-check-frequency="20s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294578 4713 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294582 4713 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294588 4713 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294593 4713 flags.go:64] FLAG: --healthz-port="10248" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294598 4713 flags.go:64] FLAG: --help="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294602 4713 flags.go:64] FLAG: --hostname-override="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294607 4713 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294612 4713 flags.go:64] FLAG: --http-check-frequency="20s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294617 4713 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 08 
00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294621 4713 flags.go:64] FLAG: --image-credential-provider-config="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294625 4713 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294629 4713 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294634 4713 flags.go:64] FLAG: --image-service-endpoint="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294638 4713 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294642 4713 flags.go:64] FLAG: --kube-api-burst="100" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294647 4713 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294651 4713 flags.go:64] FLAG: --kube-api-qps="50" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294655 4713 flags.go:64] FLAG: --kube-reserved="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294659 4713 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294663 4713 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294667 4713 flags.go:64] FLAG: --kubelet-cgroups="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294671 4713 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294675 4713 flags.go:64] FLAG: --lock-file="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294680 4713 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294684 4713 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294688 4713 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 
08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294700 4713 flags.go:64] FLAG: --log-json-split-stream="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294704 4713 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294708 4713 flags.go:64] FLAG: --log-text-split-stream="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294712 4713 flags.go:64] FLAG: --logging-format="text" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294716 4713 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294721 4713 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294726 4713 flags.go:64] FLAG: --manifest-url="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294730 4713 flags.go:64] FLAG: --manifest-url-header="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294736 4713 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294740 4713 flags.go:64] FLAG: --max-open-files="1000000" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294746 4713 flags.go:64] FLAG: --max-pods="110" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294750 4713 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294754 4713 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294758 4713 flags.go:64] FLAG: --memory-manager-policy="None" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294763 4713 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294768 4713 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294773 4713 flags.go:64] FLAG: 
--node-ip="192.168.126.11" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294778 4713 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294788 4713 flags.go:64] FLAG: --node-status-max-images="50" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294793 4713 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294797 4713 flags.go:64] FLAG: --oom-score-adj="-999" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294802 4713 flags.go:64] FLAG: --pod-cidr="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294806 4713 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294814 4713 flags.go:64] FLAG: --pod-manifest-path="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294838 4713 flags.go:64] FLAG: --pod-max-pids="-1" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294843 4713 flags.go:64] FLAG: --pods-per-core="0" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294848 4713 flags.go:64] FLAG: --port="10250" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294852 4713 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294856 4713 flags.go:64] FLAG: --provider-id="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294860 4713 flags.go:64] FLAG: --qos-reserved="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294865 4713 flags.go:64] FLAG: --read-only-port="10255" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294869 4713 flags.go:64] FLAG: --register-node="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294874 4713 flags.go:64] FLAG: 
--register-schedulable="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294878 4713 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294886 4713 flags.go:64] FLAG: --registry-burst="10" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294891 4713 flags.go:64] FLAG: --registry-qps="5" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294895 4713 flags.go:64] FLAG: --reserved-cpus="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294899 4713 flags.go:64] FLAG: --reserved-memory="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294905 4713 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294910 4713 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294915 4713 flags.go:64] FLAG: --rotate-certificates="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294919 4713 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294924 4713 flags.go:64] FLAG: --runonce="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294928 4713 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294933 4713 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294937 4713 flags.go:64] FLAG: --seccomp-default="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294941 4713 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294946 4713 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294950 4713 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294955 4713 
flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294960 4713 flags.go:64] FLAG: --storage-driver-password="root" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294964 4713 flags.go:64] FLAG: --storage-driver-secure="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294968 4713 flags.go:64] FLAG: --storage-driver-table="stats" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294973 4713 flags.go:64] FLAG: --storage-driver-user="root" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294977 4713 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294982 4713 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294988 4713 flags.go:64] FLAG: --system-cgroups="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.294993 4713 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295000 4713 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295005 4713 flags.go:64] FLAG: --tls-cert-file="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295010 4713 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295016 4713 flags.go:64] FLAG: --tls-min-version="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295020 4713 flags.go:64] FLAG: --tls-private-key-file="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295025 4713 flags.go:64] FLAG: --topology-manager-policy="none" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295030 4713 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295035 4713 flags.go:64] FLAG: --topology-manager-scope="container" Mar 08 00:05:46 crc kubenswrapper[4713]: 
I0308 00:05:46.295039 4713 flags.go:64] FLAG: --v="2" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295046 4713 flags.go:64] FLAG: --version="false" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295052 4713 flags.go:64] FLAG: --vmodule="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295061 4713 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295068 4713 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295170 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295175 4713 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295179 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295184 4713 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295188 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295193 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295198 4713 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295202 4713 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295206 4713 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295210 4713 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295215 4713 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295220 4713 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295225 4713 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295229 4713 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295233 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295236 4713 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295240 4713 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295243 4713 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295247 4713 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295250 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295255 4713 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 00:05:46 crc 
kubenswrapper[4713]: W0308 00:05:46.295258 4713 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295262 4713 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295265 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295269 4713 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295273 4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295277 4713 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295280 4713 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295284 4713 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295288 4713 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295292 4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295295 4713 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295299 4713 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295303 4713 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295307 4713 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295311 4713 feature_gate.go:330] unrecognized 
feature gate: RouteAdvertisements Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295314 4713 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295318 4713 feature_gate.go:330] unrecognized feature gate: Example Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295321 4713 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295325 4713 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295329 4713 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295332 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295336 4713 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295340 4713 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295344 4713 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295348 4713 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295352 4713 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295355 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295359 4713 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295363 4713 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295367 4713 feature_gate.go:330] 
unrecognized feature gate: MachineAPIMigration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295371 4713 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295374 4713 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295378 4713 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295382 4713 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295385 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295389 4713 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295394 4713 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295400 4713 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295404 4713 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295409 4713 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295412 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295417 4713 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295421 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295425 4713 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295428 4713 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295432 4713 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295436 4713 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295440 4713 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295444 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.295449 4713 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.295462 4713 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.303381 4713 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.303428 4713 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303573 4713 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303585 4713 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303593 4713 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303601 4713 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303609 4713 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303615 4713 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303620 4713 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303626 4713 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303632 4713 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303636 4713 feature_gate.go:330] unrecognized feature gate: Example Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303642 4713 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303647 4713 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303652 4713 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303657 4713 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303662 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303666 4713 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303671 4713 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303675 4713 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 
00:05:46.303680 4713 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303685 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303689 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303693 4713 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303698 4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303703 4713 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303708 4713 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303717 4713 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303723 4713 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303728 4713 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303735 4713 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303741 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303746 4713 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303753 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303759 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303763 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303768 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303773 4713 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303780 4713 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303787 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303793 4713 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303798 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303802 4713 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303806 4713 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303811 4713 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303815 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303819 4713 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303841 4713 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303846 4713 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303851 4713 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303856 4713 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303860 4713 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303865 4713 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 
00:05:46.303869 4713 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303873 4713 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303878 4713 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303882 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303887 4713 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303892 4713 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303896 4713 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303901 4713 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303905 4713 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303910 4713 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303923 4713 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303928 4713 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303938 4713 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303943 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303949 4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 
08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303954 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303959 4713 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303965 4713 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303969 4713 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.303975 4713 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.303985 4713 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304171 4713 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304184 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304189 4713 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304195 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304200 4713 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 
00:05:46.304208 4713 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304216 4713 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304224 4713 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304231 4713 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304237 4713 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304243 4713 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304248 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304254 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304260 4713 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304265 4713 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304270 4713 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304275 4713 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304280 4713 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304285 4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304290 4713 
feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304295 4713 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304300 4713 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304305 4713 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304311 4713 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304318 4713 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304325 4713 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304331 4713 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304335 4713 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304342 4713 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304347 4713 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304352 4713 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304357 4713 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304362 4713 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304367 4713 feature_gate.go:330] unrecognized feature gate: 
AzureWorkloadIdentity Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304373 4713 feature_gate.go:330] unrecognized feature gate: Example Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304380 4713 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304386 4713 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304391 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304397 4713 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304402 4713 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304408 4713 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304413 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304419 4713 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304423 4713 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304429 4713 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304435 4713 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304441 4713 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304446 4713 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304451 4713 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304456 4713 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304461 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304466 4713 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304471 4713 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304477 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304482 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304490 4713 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304496 4713 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304502 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304507 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304513 4713 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304520 
4713 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304527 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304532 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304538 4713 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304543 4713 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304548 4713 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304554 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304559 4713 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304564 4713 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304569 4713 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.304574 4713 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.304583 4713 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false 
ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.304872 4713 server.go:940] "Client rotation is on, will bootstrap in background" Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.310359 4713 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.313603 4713 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.313718 4713 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.315534 4713 server.go:997] "Starting client certificate rotation" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.315561 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.315774 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.342715 4713 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.345447 4713 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.346054 4713 
dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.375010 4713 log.go:25] "Validated CRI v1 runtime API" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.406537 4713 log.go:25] "Validated CRI v1 image API" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.408805 4713 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.415281 4713 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-08-00-00-47-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.415418 4713 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.445469 4713 manager.go:217] Machine: {Timestamp:2026-03-08 00:05:46.44301786 +0000 UTC m=+0.562650163 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2aa69308-6450-4bec-8579-2da85b0e580a BootID:e399c248-6394-463b-9421-3cdd5fff0be8 Filesystems:[{Device:/tmp DeviceMajor:0 
DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0d:8d:f3 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0d:8d:f3 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:1b:d3:3e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:e6:b5:5b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b8:2a:8f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:58:30:f9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:8a:af:02:a9:bc:be Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0a:9f:c1:92:7e:56 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: 
DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.445969 4713 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.446220 4713 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.448555 4713 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.449063 4713 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.449132 4713 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.449527 4713 topology_manager.go:138] "Creating topology manager with none policy" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.449562 4713 container_manager_linux.go:303] "Creating device plugin manager" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.450125 4713 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.450186 4713 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.450666 4713 state_mem.go:36] "Initialized new in-memory state store" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.451372 4713 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.455360 4713 kubelet.go:418] "Attempting to sync node with API server" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.455398 4713 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.455426 4713 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.455455 4713 kubelet.go:324] "Adding apiserver pod source" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.455475 4713 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.459925 4713 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.461047 4713 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.462513 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.462612 4713 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.462521 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.462712 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.462776 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.464771 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.464815 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 
00:05:46.464861 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.464878 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.464902 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.464931 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.464945 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.464980 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.464998 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.465014 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.465048 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.465063 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.465979 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.466812 4713 server.go:1280] "Started kubelet" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.467440 4713 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 08 00:05:46 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.468282 4713 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.474735 4713 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.474346 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.477430 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.477544 4713 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.477814 4713 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.477972 4713 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.482000 4713 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.482864 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="200ms" Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.483078 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.483076 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.483198 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.483314 4713 server.go:460] "Adding debug handlers to kubelet server" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.483783 4713 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.483857 4713 factory.go:55] Registering systemd factory Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.483882 4713 factory.go:221] Registration of the systemd container factory successfully Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.484300 4713 factory.go:153] Registering CRI-O factory Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.484335 4713 factory.go:221] Registration of the crio container factory successfully Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.484365 4713 factory.go:103] Registering Raw factory Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.484384 4713 manager.go:1196] Started watching for new ooms in manager Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.483315 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.188:6443: connect: connection refused" 
event="&Event{ObjectMeta:{crc.189ab4f00f5f5726 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,LastTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.485672 4713 manager.go:319] Starting recovery of all containers Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.495802 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.496128 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.496193 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.496254 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 08 00:05:46 crc 
kubenswrapper[4713]: I0308 00:05:46.496329 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.496388 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.496444 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.496498 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.496555 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.496620 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.496676 4713 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.497989 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.498146 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.498239 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.498317 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.498410 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.498491 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.498579 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.498660 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.498735 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.498963 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.499061 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.499157 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.499246 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.499342 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.499443 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.499539 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.499624 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.499721 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.499910 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.500005 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.500091 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.500164 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.500253 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.500405 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" 
seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.500505 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.500603 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.500785 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.500922 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.500979 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501014 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 
00:05:46.501042 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501076 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501100 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501137 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501170 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501207 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501242 4713 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501271 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501300 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501327 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501358 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501420 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501473 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501509 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.501552 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.503724 4713 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.503813 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.503874 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.503898 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.503934 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.503955 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.503973 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.503995 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504013 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504033 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504050 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504066 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504144 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504164 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504182 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504203 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504267 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504288 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504305 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504321 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504344 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504359 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504379 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504396 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504560 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504606 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504626 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504649 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504669 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504691 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504710 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504725 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504745 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504766 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504785 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504803 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504820 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504876 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504898 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504924 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504946 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504962 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504982 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.504998 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505021 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505056 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505073 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505096 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505116 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505153 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505180 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505263 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505346 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505405 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505444 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505503 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505567 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505618 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505678 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505748 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505808 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505948 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.505987 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.506759 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507023 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507059 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507086 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507112 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507134 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507160 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507186 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507210 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507233 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507257 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507281 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507305 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507328 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507351 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507373 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507395 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507418 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507444 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507475 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507506 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507536 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507573 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507602 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507632 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507660 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507692 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507726 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507760 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507790 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507820 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507881 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507909 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507945 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507968 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.507991 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508013 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508035 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508057 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508080 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508101 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508129 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508159 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508188 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508218 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508248 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508275 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508297 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508320 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508344 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508366 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508391 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508412 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508435 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508457 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508481 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508504 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508527 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508549 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508571 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508600 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508622 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508645 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508670 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508694 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508718 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508741 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508763 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508787 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508810 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d"
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508870 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508903 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508926 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508949 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508971 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.508993 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" 
seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509016 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509044 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509070 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509092 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509116 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509141 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 
00:05:46.509163 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509184 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509207 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509229 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509251 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509275 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509298 4713 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509321 4713 reconstruct.go:97] "Volume reconstruction finished" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.509338 4713 reconciler.go:26] "Reconciler: start to sync state" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.512939 4713 manager.go:324] Recovery completed Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.525461 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.532329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.532381 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.532390 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.535257 4713 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.535275 4713 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.535301 4713 state_mem.go:36] "Initialized new in-memory state store" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.537022 4713 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.539605 4713 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.539655 4713 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.539690 4713 kubelet.go:2335] "Starting kubelet main sync loop" Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.539758 4713 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 08 00:05:46 crc kubenswrapper[4713]: W0308 00:05:46.540790 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.540910 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.555477 4713 policy_none.go:49] "None policy: Start" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.557281 4713 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.557311 4713 state_mem.go:35] "Initializing new in-memory state store" Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.583232 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.606900 4713 manager.go:334] "Starting Device Plugin manager" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.606961 4713 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.606979 4713 server.go:79] "Starting device plugin registration server" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.607580 4713 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.607605 4713 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.608324 4713 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.608896 4713 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.608927 4713 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.616486 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.640833 4713 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.640943 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.642982 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.643015 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.643024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.643155 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.643526 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.643603 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.643917 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.644010 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.644100 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.644450 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.644575 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.644636 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.645027 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.645068 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.645104 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.645860 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.645886 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.645896 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.645864 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.646000 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.646009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.646094 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: 
I0308 00:05:46.646213 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.646248 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.647327 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.647374 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.647387 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.647571 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.647751 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.647813 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.648350 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.648398 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.648416 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.648695 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.648709 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.648716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.648894 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.648921 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.649500 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.649542 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.649557 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.650049 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.650098 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.650118 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.683567 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="400ms" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.708450 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.709930 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: 
I0308 00:05:46.710004 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.710025 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.710077 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.711745 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.712428 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.712525 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.712600 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.712729 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.712782 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.712885 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.713002 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.713074 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.713132 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.713191 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.713243 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.713288 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.713542 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.713633 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.713798 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815448 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815507 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815528 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815545 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815562 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815578 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815593 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815608 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815625 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815640 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815657 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815663 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815743 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815798 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815847 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815819 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815880 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815889 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815914 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815760 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815945 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815959 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.815994 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.816002 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.816012 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.816027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.816032 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.816050 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.816222 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.912418 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.915633 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.915682 4713 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.915696 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.915725 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:05:46 crc kubenswrapper[4713]: E0308 00:05:46.916377 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.975309 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 08 00:05:46 crc kubenswrapper[4713]: I0308 00:05:46.989222 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.015200 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.020968 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.025269 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 00:05:47 crc kubenswrapper[4713]: W0308 00:05:47.040795 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-b8705c4371811024393358558a583e9da381d980e998fe03537a4dcbb6e1ad3e WatchSource:0}: Error finding container b8705c4371811024393358558a583e9da381d980e998fe03537a4dcbb6e1ad3e: Status 404 returned error can't find the container with id b8705c4371811024393358558a583e9da381d980e998fe03537a4dcbb6e1ad3e Mar 08 00:05:47 crc kubenswrapper[4713]: W0308 00:05:47.041981 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-4c193dbaf164c3e81c6dbefd181d7d05eb996b189663b9bc1e2334f218955727 WatchSource:0}: Error finding container 4c193dbaf164c3e81c6dbefd181d7d05eb996b189663b9bc1e2334f218955727: Status 404 returned error can't find the container with id 4c193dbaf164c3e81c6dbefd181d7d05eb996b189663b9bc1e2334f218955727 Mar 08 00:05:47 crc kubenswrapper[4713]: W0308 00:05:47.047498 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e58514de85c6535e3cc33a4acd1587e3b18c29dad932fde0dde2f9a7b8fd1e6d WatchSource:0}: Error finding container e58514de85c6535e3cc33a4acd1587e3b18c29dad932fde0dde2f9a7b8fd1e6d: Status 404 returned error can't find the container with id e58514de85c6535e3cc33a4acd1587e3b18c29dad932fde0dde2f9a7b8fd1e6d Mar 08 00:05:47 crc kubenswrapper[4713]: W0308 00:05:47.054269 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ef3ee215fcf62737183131e223e0f17021c2c6b1acbe0fa134a8ae11c761d957 
WatchSource:0}: Error finding container ef3ee215fcf62737183131e223e0f17021c2c6b1acbe0fa134a8ae11c761d957: Status 404 returned error can't find the container with id ef3ee215fcf62737183131e223e0f17021c2c6b1acbe0fa134a8ae11c761d957 Mar 08 00:05:47 crc kubenswrapper[4713]: E0308 00:05:47.085231 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="800ms" Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.317008 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.318349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.318386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.318397 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.318422 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:05:47 crc kubenswrapper[4713]: E0308 00:05:47.318855 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.477321 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 
00:05:47.544268 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e58514de85c6535e3cc33a4acd1587e3b18c29dad932fde0dde2f9a7b8fd1e6d"} Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.545260 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4c193dbaf164c3e81c6dbefd181d7d05eb996b189663b9bc1e2334f218955727"} Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.546410 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b8705c4371811024393358558a583e9da381d980e998fe03537a4dcbb6e1ad3e"} Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.547275 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ef3ee215fcf62737183131e223e0f17021c2c6b1acbe0fa134a8ae11c761d957"} Mar 08 00:05:47 crc kubenswrapper[4713]: I0308 00:05:47.548229 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0f11c5e7e5f21fadd343f33ce48f0110c72e6c7fa4f8c6840db3ee282646b05e"} Mar 08 00:05:47 crc kubenswrapper[4713]: W0308 00:05:47.664064 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:47 crc kubenswrapper[4713]: E0308 00:05:47.664444 4713 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:05:47 crc kubenswrapper[4713]: W0308 00:05:47.704674 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:47 crc kubenswrapper[4713]: E0308 00:05:47.704781 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:05:47 crc kubenswrapper[4713]: W0308 00:05:47.744727 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:47 crc kubenswrapper[4713]: E0308 00:05:47.744886 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:05:47 crc kubenswrapper[4713]: W0308 00:05:47.787014 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:47 crc kubenswrapper[4713]: E0308 00:05:47.787121 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:05:47 crc kubenswrapper[4713]: E0308 00:05:47.886579 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="1.6s" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.119873 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.123576 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.123648 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.123668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.123707 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:05:48 crc kubenswrapper[4713]: E0308 00:05:48.124399 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Mar 08 00:05:48 
crc kubenswrapper[4713]: I0308 00:05:48.354336 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 00:05:48 crc kubenswrapper[4713]: E0308 00:05:48.355966 4713 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.477314 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.553046 4713 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a" exitCode=0 Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.553133 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a"} Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.553217 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.554283 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.554336 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 
00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.554349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.556953 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c59d7811343e8c519ce7d8d96d1ef70f2cecb384c1fe32fcee17e814e5abb99b"} Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.556994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8b67e28c29833077f4c11144409783e14d6a3b1875012c1e86c576cae0b38e46"} Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.557008 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0be7ef7bc48e87864d8d0199a68369427a1925475c7b4edd1d2554d21a165fcc"} Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.557020 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414"} Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.557139 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.558484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.558509 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.558519 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.560414 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510" exitCode=0 Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.560492 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510"} Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.560526 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.561495 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.561566 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.561595 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.562081 4713 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2" exitCode=0 Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.562159 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2"} Mar 08 
00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.562180 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.562918 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.562939 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.562950 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.564133 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.564160 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675"} Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.564171 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.564024 4713 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675" exitCode=0 Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.568277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.568313 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.568325 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.568701 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.568775 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.568802 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:48 crc kubenswrapper[4713]: E0308 00:05:48.885782 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ab4f00f5f5726 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,LastTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:05:48 crc kubenswrapper[4713]: I0308 00:05:48.962794 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:49 crc kubenswrapper[4713]: W0308 00:05:49.412363 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: 
connection refused Mar 08 00:05:49 crc kubenswrapper[4713]: E0308 00:05:49.412461 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.477669 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 08 00:05:49 crc kubenswrapper[4713]: E0308 00:05:49.487334 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="3.2s" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.569272 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1041e91f6569d3bf51ad7e2b80e1929b032d12c87a67f43b4f33630eed18035c"} Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.569346 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e"} Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.569366 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637"} Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.569379 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7"} Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.569390 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd"} Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.569347 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.570149 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.570184 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.570198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.570545 4713 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9" exitCode=0 Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.570611 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9"} Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.570761 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.571528 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.571555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.571568 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.572529 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9a361c383172f4481b046398c6a434f347b26cf18a9b0c2d77652114eb089de5"} Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.572551 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.573172 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.573201 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.573216 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.578408 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29"} Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.578447 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7"} Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.578460 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20"} Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.578464 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.578516 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.579682 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.579709 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.579722 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.580238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.580262 4713 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.580275 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.725364 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.726630 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.726669 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.726686 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:49 crc kubenswrapper[4713]: I0308 00:05:49.726716 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:05:49 crc kubenswrapper[4713]: E0308 00:05:49.727169 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.587269 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591"} Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.587316 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.587201 4713 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591" exitCode=0 Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.587618 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.587723 4713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.587764 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.587618 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.588173 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.588079 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.588945 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.589019 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.589047 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.590102 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.590156 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 
00:05:50.590177 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.590451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.590482 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.590497 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.590518 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.590538 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.590521 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.591484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.591509 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:50 crc kubenswrapper[4713]: I0308 00:05:50.591527 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.005035 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.015268 4713 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.365952 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.593962 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5"} Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.594060 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c"} Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.594142 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90"} Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.594168 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7"} Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.593983 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.593983 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.597191 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 
00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.597245 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.597260 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.597210 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.597326 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.597342 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.963556 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" start-of-body= Mar 08 00:05:51 crc kubenswrapper[4713]: I0308 00:05:51.963672 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.460654 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.604796 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857"} Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.604902 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.604862 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.606416 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.606422 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.606531 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.606465 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.606554 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.606592 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.927669 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.929388 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.929460 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.929479 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:52 crc kubenswrapper[4713]: I0308 00:05:52.929520 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.222161 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.222428 4713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.222485 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.224256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.224317 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.224342 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.552179 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.608365 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.608459 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.610066 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.610066 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.610137 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.610165 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.610165 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:53 crc kubenswrapper[4713]: I0308 00:05:53.610267 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:54 crc kubenswrapper[4713]: I0308 00:05:54.005092 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:05:54 crc kubenswrapper[4713]: I0308 00:05:54.611011 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:54 crc kubenswrapper[4713]: I0308 00:05:54.612016 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:54 crc kubenswrapper[4713]: I0308 00:05:54.612068 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:54 crc kubenswrapper[4713]: I0308 00:05:54.612085 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:55 crc kubenswrapper[4713]: I0308 00:05:55.669542 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 08 00:05:55 crc kubenswrapper[4713]: I0308 00:05:55.669906 4713 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:55 crc kubenswrapper[4713]: I0308 00:05:55.672017 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:55 crc kubenswrapper[4713]: I0308 00:05:55.672077 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:55 crc kubenswrapper[4713]: I0308 00:05:55.672095 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:56 crc kubenswrapper[4713]: I0308 00:05:56.520584 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:05:56 crc kubenswrapper[4713]: I0308 00:05:56.520773 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:05:56 crc kubenswrapper[4713]: I0308 00:05:56.522184 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:56 crc kubenswrapper[4713]: I0308 00:05:56.522223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:56 crc kubenswrapper[4713]: I0308 00:05:56.522236 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:05:56 crc kubenswrapper[4713]: E0308 00:05:56.616631 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:05:59 crc kubenswrapper[4713]: I0308 00:05:59.775204 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 08 00:05:59 crc kubenswrapper[4713]: I0308 00:05:59.775416 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 08 00:05:59 crc kubenswrapper[4713]: I0308 00:05:59.777020 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:05:59 crc kubenswrapper[4713]: I0308 00:05:59.777069 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:05:59 crc kubenswrapper[4713]: I0308 00:05:59.777081 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:00 crc kubenswrapper[4713]: W0308 00:06:00.156183 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z Mar 08 00:06:00 crc kubenswrapper[4713]: E0308 00:06:00.156332 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:00 crc kubenswrapper[4713]: W0308 00:06:00.157424 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z Mar 08 00:06:00 crc kubenswrapper[4713]: E0308 00:06:00.157521 4713 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:00 crc kubenswrapper[4713]: E0308 00:06:00.162199 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ab4f00f5f5726 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,LastTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.162744 4713 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.162906 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 08 00:06:00 crc kubenswrapper[4713]: E0308 00:06:00.165516 4713 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.166966 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z Mar 08 00:06:00 crc kubenswrapper[4713]: W0308 00:06:00.167026 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z Mar 08 00:06:00 crc kubenswrapper[4713]: E0308 00:06:00.167113 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 08 00:06:00 crc kubenswrapper[4713]: W0308 00:06:00.168632 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z Mar 08 00:06:00 crc kubenswrapper[4713]: E0308 00:06:00.168683 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:00 crc kubenswrapper[4713]: E0308 00:06:00.169036 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.170152 4713 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.170208 4713 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 08 00:06:00 crc kubenswrapper[4713]: E0308 00:06:00.171554 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z" node="crc" Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.479117 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:00Z is after 2026-02-23T05:33:13Z Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.626098 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.628143 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1041e91f6569d3bf51ad7e2b80e1929b032d12c87a67f43b4f33630eed18035c" exitCode=255 Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.628216 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1041e91f6569d3bf51ad7e2b80e1929b032d12c87a67f43b4f33630eed18035c"} Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.628441 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.629505 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.629536 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.629546 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:00 crc kubenswrapper[4713]: I0308 00:06:00.630065 4713 scope.go:117] "RemoveContainer" containerID="1041e91f6569d3bf51ad7e2b80e1929b032d12c87a67f43b4f33630eed18035c" Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.370519 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.370665 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.371634 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.371670 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.371680 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.479623 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:01Z is after 2026-02-23T05:33:13Z Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.632405 4713 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.634399 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"05b59daa29e7fd595d545cd3ebde1a3cf8156265b284fab862a5365d0c362f33"} Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.634582 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.635339 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.635376 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.635386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.964159 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:06:01 crc kubenswrapper[4713]: I0308 00:06:01.964267 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" Mar 08 00:06:02 crc kubenswrapper[4713]: I0308 00:06:02.482088 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:02Z is after 2026-02-23T05:33:13Z Mar 08 00:06:02 crc kubenswrapper[4713]: I0308 00:06:02.641186 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 08 00:06:02 crc kubenswrapper[4713]: I0308 00:06:02.642065 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 08 00:06:02 crc kubenswrapper[4713]: I0308 00:06:02.644924 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="05b59daa29e7fd595d545cd3ebde1a3cf8156265b284fab862a5365d0c362f33" exitCode=255 Mar 08 00:06:02 crc kubenswrapper[4713]: I0308 00:06:02.644993 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"05b59daa29e7fd595d545cd3ebde1a3cf8156265b284fab862a5365d0c362f33"} Mar 08 00:06:02 crc kubenswrapper[4713]: I0308 00:06:02.645110 4713 scope.go:117] "RemoveContainer" containerID="1041e91f6569d3bf51ad7e2b80e1929b032d12c87a67f43b4f33630eed18035c" Mar 08 00:06:02 crc kubenswrapper[4713]: I0308 00:06:02.645316 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:02 crc kubenswrapper[4713]: I0308 00:06:02.646811 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:02 
crc kubenswrapper[4713]: I0308 00:06:02.646895 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:02 crc kubenswrapper[4713]: I0308 00:06:02.646980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:02 crc kubenswrapper[4713]: I0308 00:06:02.648048 4713 scope.go:117] "RemoveContainer" containerID="05b59daa29e7fd595d545cd3ebde1a3cf8156265b284fab862a5365d0c362f33" Mar 08 00:06:02 crc kubenswrapper[4713]: E0308 00:06:02.648402 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:03 crc kubenswrapper[4713]: I0308 00:06:03.480738 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:03Z is after 2026-02-23T05:33:13Z Mar 08 00:06:03 crc kubenswrapper[4713]: I0308 00:06:03.552916 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:06:03 crc kubenswrapper[4713]: W0308 00:06:03.640878 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:03Z is after 2026-02-23T05:33:13Z Mar 08 00:06:03 crc 
kubenswrapper[4713]: E0308 00:06:03.640942 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:03 crc kubenswrapper[4713]: I0308 00:06:03.648392 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 08 00:06:03 crc kubenswrapper[4713]: I0308 00:06:03.650133 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:03 crc kubenswrapper[4713]: I0308 00:06:03.650878 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:03 crc kubenswrapper[4713]: I0308 00:06:03.650915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:03 crc kubenswrapper[4713]: I0308 00:06:03.650927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:03 crc kubenswrapper[4713]: I0308 00:06:03.651438 4713 scope.go:117] "RemoveContainer" containerID="05b59daa29e7fd595d545cd3ebde1a3cf8156265b284fab862a5365d0c362f33" Mar 08 00:06:03 crc kubenswrapper[4713]: E0308 00:06:03.651591 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:04 crc kubenswrapper[4713]: I0308 00:06:04.011934 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:06:04 crc kubenswrapper[4713]: I0308 00:06:04.290890 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:06:04 crc kubenswrapper[4713]: I0308 00:06:04.479947 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:04Z is after 2026-02-23T05:33:13Z Mar 08 00:06:04 crc kubenswrapper[4713]: I0308 00:06:04.652594 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:04 crc kubenswrapper[4713]: I0308 00:06:04.653598 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:04 crc kubenswrapper[4713]: I0308 00:06:04.653639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:04 crc kubenswrapper[4713]: I0308 00:06:04.653655 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:04 crc kubenswrapper[4713]: I0308 00:06:04.654279 4713 scope.go:117] "RemoveContainer" containerID="05b59daa29e7fd595d545cd3ebde1a3cf8156265b284fab862a5365d0c362f33" Mar 08 00:06:04 crc kubenswrapper[4713]: E0308 00:06:04.654496 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:04 crc kubenswrapper[4713]: I0308 00:06:04.657550 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:06:05 crc kubenswrapper[4713]: W0308 00:06:05.438583 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:05Z is after 2026-02-23T05:33:13Z Mar 08 00:06:05 crc kubenswrapper[4713]: E0308 00:06:05.438719 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:05 crc kubenswrapper[4713]: W0308 00:06:05.470405 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:05Z is after 2026-02-23T05:33:13Z Mar 08 00:06:05 crc kubenswrapper[4713]: E0308 00:06:05.470519 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:05 crc kubenswrapper[4713]: I0308 00:06:05.482338 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:05Z is after 2026-02-23T05:33:13Z Mar 08 00:06:05 crc kubenswrapper[4713]: I0308 00:06:05.655009 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:05 crc kubenswrapper[4713]: I0308 00:06:05.656449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:05 crc kubenswrapper[4713]: I0308 00:06:05.656513 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:05 crc kubenswrapper[4713]: I0308 00:06:05.656532 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:05 crc kubenswrapper[4713]: I0308 00:06:05.657616 4713 scope.go:117] "RemoveContainer" containerID="05b59daa29e7fd595d545cd3ebde1a3cf8156265b284fab862a5365d0c362f33" Mar 08 00:06:05 crc kubenswrapper[4713]: E0308 00:06:05.657986 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:06 crc kubenswrapper[4713]: I0308 00:06:06.482258 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:06Z is after 2026-02-23T05:33:13Z Mar 08 00:06:06 crc kubenswrapper[4713]: I0308 00:06:06.572412 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:06 crc kubenswrapper[4713]: E0308 00:06:06.573257 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:06Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 08 00:06:06 crc kubenswrapper[4713]: I0308 00:06:06.573820 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:06 crc kubenswrapper[4713]: I0308 00:06:06.573984 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:06 crc kubenswrapper[4713]: I0308 00:06:06.574089 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:06 crc kubenswrapper[4713]: I0308 00:06:06.574208 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:06:06 crc kubenswrapper[4713]: E0308 00:06:06.579932 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:06Z is 
after 2026-02-23T05:33:13Z" node="crc" Mar 08 00:06:06 crc kubenswrapper[4713]: E0308 00:06:06.617009 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:06:06 crc kubenswrapper[4713]: I0308 00:06:06.657615 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:06 crc kubenswrapper[4713]: I0308 00:06:06.658669 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:06 crc kubenswrapper[4713]: I0308 00:06:06.658727 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:06 crc kubenswrapper[4713]: I0308 00:06:06.658746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:06 crc kubenswrapper[4713]: I0308 00:06:06.659648 4713 scope.go:117] "RemoveContainer" containerID="05b59daa29e7fd595d545cd3ebde1a3cf8156265b284fab862a5365d0c362f33" Mar 08 00:06:06 crc kubenswrapper[4713]: E0308 00:06:06.659967 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:07 crc kubenswrapper[4713]: I0308 00:06:07.481882 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:07Z is after 2026-02-23T05:33:13Z Mar 08 00:06:08 crc kubenswrapper[4713]: 
I0308 00:06:08.304227 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 00:06:08 crc kubenswrapper[4713]: E0308 00:06:08.309921 4713 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:08 crc kubenswrapper[4713]: I0308 00:06:08.481116 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:08Z is after 2026-02-23T05:33:13Z Mar 08 00:06:09 crc kubenswrapper[4713]: I0308 00:06:09.482267 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:09Z is after 2026-02-23T05:33:13Z Mar 08 00:06:09 crc kubenswrapper[4713]: W0308 00:06:09.644782 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:09Z is after 2026-02-23T05:33:13Z Mar 08 00:06:09 crc kubenswrapper[4713]: E0308 00:06:09.644926 4713 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:09 crc kubenswrapper[4713]: I0308 00:06:09.813297 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 08 00:06:09 crc kubenswrapper[4713]: I0308 00:06:09.813560 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:09 crc kubenswrapper[4713]: I0308 00:06:09.815185 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:09 crc kubenswrapper[4713]: I0308 00:06:09.815235 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:09 crc kubenswrapper[4713]: I0308 00:06:09.815247 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:09 crc kubenswrapper[4713]: I0308 00:06:09.830619 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 08 00:06:10 crc kubenswrapper[4713]: E0308 00:06:10.165462 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:10Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ab4f00f5f5726 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,LastTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:10 crc kubenswrapper[4713]: I0308 00:06:10.480049 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:10Z is after 2026-02-23T05:33:13Z Mar 08 00:06:10 crc kubenswrapper[4713]: I0308 00:06:10.668912 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:10 crc kubenswrapper[4713]: I0308 00:06:10.670522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:10 crc kubenswrapper[4713]: I0308 00:06:10.670548 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:10 crc kubenswrapper[4713]: I0308 00:06:10.670555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:11 crc kubenswrapper[4713]: I0308 00:06:11.481703 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:11Z is after 2026-02-23T05:33:13Z Mar 08 00:06:11 crc 
kubenswrapper[4713]: W0308 00:06:11.508577 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:11Z is after 2026-02-23T05:33:13Z Mar 08 00:06:11 crc kubenswrapper[4713]: E0308 00:06:11.508670 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:11 crc kubenswrapper[4713]: I0308 00:06:11.963544 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:06:11 crc kubenswrapper[4713]: I0308 00:06:11.963631 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:06:11 crc kubenswrapper[4713]: I0308 00:06:11.963687 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:06:11 crc 
kubenswrapper[4713]: I0308 00:06:11.963876 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:11 crc kubenswrapper[4713]: I0308 00:06:11.965310 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:11 crc kubenswrapper[4713]: I0308 00:06:11.965398 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:11 crc kubenswrapper[4713]: I0308 00:06:11.965432 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:11 crc kubenswrapper[4713]: I0308 00:06:11.966398 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"0be7ef7bc48e87864d8d0199a68369427a1925475c7b4edd1d2554d21a165fcc"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 08 00:06:11 crc kubenswrapper[4713]: I0308 00:06:11.966741 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://0be7ef7bc48e87864d8d0199a68369427a1925475c7b4edd1d2554d21a165fcc" gracePeriod=30 Mar 08 00:06:12 crc kubenswrapper[4713]: I0308 00:06:12.479792 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:12Z is after 2026-02-23T05:33:13Z Mar 08 00:06:12 crc kubenswrapper[4713]: I0308 00:06:12.675813 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 08 00:06:12 crc kubenswrapper[4713]: I0308 00:06:12.676288 4713 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0be7ef7bc48e87864d8d0199a68369427a1925475c7b4edd1d2554d21a165fcc" exitCode=255 Mar 08 00:06:12 crc kubenswrapper[4713]: I0308 00:06:12.676322 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0be7ef7bc48e87864d8d0199a68369427a1925475c7b4edd1d2554d21a165fcc"} Mar 08 00:06:12 crc kubenswrapper[4713]: I0308 00:06:12.676350 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc"} Mar 08 00:06:12 crc kubenswrapper[4713]: I0308 00:06:12.676515 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:12 crc kubenswrapper[4713]: I0308 00:06:12.677550 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:12 crc kubenswrapper[4713]: I0308 00:06:12.677574 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:12 crc kubenswrapper[4713]: I0308 00:06:12.677582 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:13 crc kubenswrapper[4713]: I0308 00:06:13.481745 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:13Z is after 2026-02-23T05:33:13Z Mar 08 00:06:13 crc kubenswrapper[4713]: E0308 00:06:13.579029 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:13Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 08 00:06:13 crc kubenswrapper[4713]: I0308 00:06:13.580192 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:13 crc kubenswrapper[4713]: I0308 00:06:13.581726 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:13 crc kubenswrapper[4713]: I0308 00:06:13.581789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:13 crc kubenswrapper[4713]: I0308 00:06:13.581812 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:13 crc kubenswrapper[4713]: I0308 00:06:13.581935 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:06:13 crc kubenswrapper[4713]: E0308 00:06:13.585481 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:13Z is after 2026-02-23T05:33:13Z" node="crc" Mar 08 00:06:13 crc kubenswrapper[4713]: W0308 00:06:13.662518 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:13Z is after 2026-02-23T05:33:13Z Mar 08 00:06:13 crc kubenswrapper[4713]: E0308 00:06:13.663482 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:14 crc kubenswrapper[4713]: I0308 00:06:14.480697 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:14Z is after 2026-02-23T05:33:13Z Mar 08 00:06:15 crc kubenswrapper[4713]: I0308 00:06:15.481986 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:15Z is after 2026-02-23T05:33:13Z Mar 08 00:06:16 crc kubenswrapper[4713]: W0308 00:06:16.044879 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:16Z is after 2026-02-23T05:33:13Z Mar 08 00:06:16 
crc kubenswrapper[4713]: E0308 00:06:16.044943 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:16 crc kubenswrapper[4713]: I0308 00:06:16.481639 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:16Z is after 2026-02-23T05:33:13Z Mar 08 00:06:16 crc kubenswrapper[4713]: I0308 00:06:16.521378 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:06:16 crc kubenswrapper[4713]: I0308 00:06:16.521556 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:16 crc kubenswrapper[4713]: I0308 00:06:16.522937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:16 crc kubenswrapper[4713]: I0308 00:06:16.522973 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:16 crc kubenswrapper[4713]: I0308 00:06:16.522985 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:16 crc kubenswrapper[4713]: E0308 00:06:16.617162 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:06:17 crc kubenswrapper[4713]: I0308 
00:06:17.482541 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:17Z is after 2026-02-23T05:33:13Z Mar 08 00:06:18 crc kubenswrapper[4713]: I0308 00:06:18.481998 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:18Z is after 2026-02-23T05:33:13Z Mar 08 00:06:18 crc kubenswrapper[4713]: I0308 00:06:18.541018 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:18 crc kubenswrapper[4713]: I0308 00:06:18.542083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:18 crc kubenswrapper[4713]: I0308 00:06:18.542121 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:18 crc kubenswrapper[4713]: I0308 00:06:18.542132 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:18 crc kubenswrapper[4713]: I0308 00:06:18.542643 4713 scope.go:117] "RemoveContainer" containerID="05b59daa29e7fd595d545cd3ebde1a3cf8156265b284fab862a5365d0c362f33" Mar 08 00:06:18 crc kubenswrapper[4713]: I0308 00:06:18.963637 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:06:18 crc kubenswrapper[4713]: I0308 00:06:18.963754 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:18 crc 
kubenswrapper[4713]: I0308 00:06:18.964636 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:18 crc kubenswrapper[4713]: I0308 00:06:18.964690 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:18 crc kubenswrapper[4713]: I0308 00:06:18.964710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:19 crc kubenswrapper[4713]: I0308 00:06:19.480013 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:19Z is after 2026-02-23T05:33:13Z Mar 08 00:06:19 crc kubenswrapper[4713]: I0308 00:06:19.698031 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 00:06:19 crc kubenswrapper[4713]: I0308 00:06:19.698650 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 08 00:06:19 crc kubenswrapper[4713]: I0308 00:06:19.700694 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ba0fec7e634640b5dace3848ee394f9c875b4ca833f93363a128e2304ef8d418" exitCode=255 Mar 08 00:06:19 crc kubenswrapper[4713]: I0308 00:06:19.700733 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ba0fec7e634640b5dace3848ee394f9c875b4ca833f93363a128e2304ef8d418"} Mar 08 00:06:19 crc kubenswrapper[4713]: I0308 
00:06:19.700774 4713 scope.go:117] "RemoveContainer" containerID="05b59daa29e7fd595d545cd3ebde1a3cf8156265b284fab862a5365d0c362f33" Mar 08 00:06:19 crc kubenswrapper[4713]: I0308 00:06:19.700932 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:19 crc kubenswrapper[4713]: I0308 00:06:19.701758 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:19 crc kubenswrapper[4713]: I0308 00:06:19.701938 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:19 crc kubenswrapper[4713]: I0308 00:06:19.702492 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:19 crc kubenswrapper[4713]: I0308 00:06:19.703167 4713 scope.go:117] "RemoveContainer" containerID="ba0fec7e634640b5dace3848ee394f9c875b4ca833f93363a128e2304ef8d418" Mar 08 00:06:19 crc kubenswrapper[4713]: E0308 00:06:19.703430 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:20 crc kubenswrapper[4713]: E0308 00:06:20.168707 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:20Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ab4f00f5f5726 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,LastTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:20 crc kubenswrapper[4713]: I0308 00:06:20.483573 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:20Z is after 2026-02-23T05:33:13Z Mar 08 00:06:20 crc kubenswrapper[4713]: E0308 00:06:20.582677 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:20Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 08 00:06:20 crc kubenswrapper[4713]: I0308 00:06:20.586035 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:20 crc kubenswrapper[4713]: I0308 00:06:20.587702 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:20 crc kubenswrapper[4713]: I0308 00:06:20.587754 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:20 crc kubenswrapper[4713]: I0308 00:06:20.587771 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 
00:06:20 crc kubenswrapper[4713]: I0308 00:06:20.587805 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:06:20 crc kubenswrapper[4713]: E0308 00:06:20.591033 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:20Z is after 2026-02-23T05:33:13Z" node="crc" Mar 08 00:06:20 crc kubenswrapper[4713]: I0308 00:06:20.704728 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 00:06:21 crc kubenswrapper[4713]: I0308 00:06:21.482185 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:21Z is after 2026-02-23T05:33:13Z Mar 08 00:06:21 crc kubenswrapper[4713]: I0308 00:06:21.964271 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:06:21 crc kubenswrapper[4713]: I0308 00:06:21.964351 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
Mar 08 00:06:22 crc kubenswrapper[4713]: I0308 00:06:22.482209 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:22Z is after 2026-02-23T05:33:13Z Mar 08 00:06:22 crc kubenswrapper[4713]: W0308 00:06:22.757135 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:22Z is after 2026-02-23T05:33:13Z Mar 08 00:06:22 crc kubenswrapper[4713]: E0308 00:06:22.757247 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:22Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:23 crc kubenswrapper[4713]: I0308 00:06:23.480620 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:23Z is after 2026-02-23T05:33:13Z Mar 08 00:06:23 crc kubenswrapper[4713]: I0308 00:06:23.552702 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:06:23 crc kubenswrapper[4713]: I0308 00:06:23.552952 4713 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 08 00:06:23 crc kubenswrapper[4713]: I0308 00:06:23.554334 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:23 crc kubenswrapper[4713]: I0308 00:06:23.554384 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:23 crc kubenswrapper[4713]: I0308 00:06:23.554398 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:23 crc kubenswrapper[4713]: I0308 00:06:23.555025 4713 scope.go:117] "RemoveContainer" containerID="ba0fec7e634640b5dace3848ee394f9c875b4ca833f93363a128e2304ef8d418" Mar 08 00:06:23 crc kubenswrapper[4713]: E0308 00:06:23.555219 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:24 crc kubenswrapper[4713]: I0308 00:06:24.290947 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:06:24 crc kubenswrapper[4713]: I0308 00:06:24.291222 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:24 crc kubenswrapper[4713]: I0308 00:06:24.292668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:24 crc kubenswrapper[4713]: I0308 00:06:24.292703 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:24 crc kubenswrapper[4713]: I0308 00:06:24.292714 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:24 crc kubenswrapper[4713]: I0308 00:06:24.293272 4713 scope.go:117] "RemoveContainer" containerID="ba0fec7e634640b5dace3848ee394f9c875b4ca833f93363a128e2304ef8d418" Mar 08 00:06:24 crc kubenswrapper[4713]: E0308 00:06:24.293448 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:24 crc kubenswrapper[4713]: I0308 00:06:24.421686 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 00:06:24 crc kubenswrapper[4713]: E0308 00:06:24.428151 4713 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:24 crc kubenswrapper[4713]: E0308 00:06:24.429650 4713 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 08 00:06:24 crc kubenswrapper[4713]: I0308 00:06:24.479755 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:24Z is after 2026-02-23T05:33:13Z Mar 08 00:06:25 crc kubenswrapper[4713]: I0308 00:06:25.483368 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:25Z is after 2026-02-23T05:33:13Z Mar 08 00:06:26 crc kubenswrapper[4713]: I0308 00:06:26.480374 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:26Z is after 2026-02-23T05:33:13Z Mar 08 00:06:26 crc kubenswrapper[4713]: E0308 00:06:26.617362 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:06:27 crc kubenswrapper[4713]: I0308 00:06:27.479481 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:27Z is after 2026-02-23T05:33:13Z Mar 08 00:06:27 crc kubenswrapper[4713]: E0308 00:06:27.586421 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:27Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 08 00:06:27 crc kubenswrapper[4713]: I0308 
00:06:27.591744 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:27 crc kubenswrapper[4713]: I0308 00:06:27.592858 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:27 crc kubenswrapper[4713]: I0308 00:06:27.592880 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:27 crc kubenswrapper[4713]: I0308 00:06:27.592889 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:27 crc kubenswrapper[4713]: I0308 00:06:27.592916 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:06:27 crc kubenswrapper[4713]: E0308 00:06:27.595296 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:27Z is after 2026-02-23T05:33:13Z" node="crc" Mar 08 00:06:28 crc kubenswrapper[4713]: I0308 00:06:28.482422 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:28Z is after 2026-02-23T05:33:13Z Mar 08 00:06:29 crc kubenswrapper[4713]: I0308 00:06:29.479021 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:29Z is after 2026-02-23T05:33:13Z Mar 08 00:06:30 crc kubenswrapper[4713]: E0308 00:06:30.172198 4713 
event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ab4f00f5f5726 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,LastTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:30 crc kubenswrapper[4713]: I0308 00:06:30.479565 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:30Z is after 2026-02-23T05:33:13Z Mar 08 00:06:31 crc kubenswrapper[4713]: I0308 00:06:31.482330 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:31Z is after 2026-02-23T05:33:13Z Mar 08 00:06:31 crc kubenswrapper[4713]: I0308 00:06:31.964155 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:06:31 crc kubenswrapper[4713]: I0308 00:06:31.964256 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:06:32 crc kubenswrapper[4713]: I0308 00:06:32.483711 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:32Z is after 2026-02-23T05:33:13Z Mar 08 00:06:33 crc kubenswrapper[4713]: W0308 00:06:33.282483 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:33Z is after 2026-02-23T05:33:13Z Mar 08 00:06:33 crc kubenswrapper[4713]: E0308 00:06:33.282582 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:33 crc kubenswrapper[4713]: W0308 00:06:33.429469 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:33Z is after 2026-02-23T05:33:13Z Mar 08 00:06:33 crc kubenswrapper[4713]: E0308 00:06:33.429577 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:33 crc kubenswrapper[4713]: I0308 00:06:33.482110 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:33Z is after 2026-02-23T05:33:13Z Mar 08 00:06:34 crc kubenswrapper[4713]: I0308 00:06:34.479731 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:34Z is after 2026-02-23T05:33:13Z Mar 08 00:06:34 crc kubenswrapper[4713]: E0308 00:06:34.590514 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:34Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 08 
00:06:34 crc kubenswrapper[4713]: I0308 00:06:34.595833 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:34 crc kubenswrapper[4713]: I0308 00:06:34.597024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:34 crc kubenswrapper[4713]: I0308 00:06:34.597053 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:34 crc kubenswrapper[4713]: I0308 00:06:34.597061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:34 crc kubenswrapper[4713]: I0308 00:06:34.597081 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:06:34 crc kubenswrapper[4713]: E0308 00:06:34.600377 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:34Z is after 2026-02-23T05:33:13Z" node="crc" Mar 08 00:06:35 crc kubenswrapper[4713]: I0308 00:06:35.481637 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:35Z is after 2026-02-23T05:33:13Z Mar 08 00:06:35 crc kubenswrapper[4713]: W0308 00:06:35.931325 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:35Z is after 2026-02-23T05:33:13Z 
Mar 08 00:06:35 crc kubenswrapper[4713]: E0308 00:06:35.931456 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 00:06:36 crc kubenswrapper[4713]: I0308 00:06:36.483239 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:36Z is after 2026-02-23T05:33:13Z Mar 08 00:06:36 crc kubenswrapper[4713]: E0308 00:06:36.618268 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:06:37 crc kubenswrapper[4713]: I0308 00:06:37.261303 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 00:06:37 crc kubenswrapper[4713]: I0308 00:06:37.261460 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:37 crc kubenswrapper[4713]: I0308 00:06:37.262759 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:37 crc kubenswrapper[4713]: I0308 00:06:37.262857 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:37 crc kubenswrapper[4713]: I0308 00:06:37.262878 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:37 crc 
kubenswrapper[4713]: I0308 00:06:37.482992 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:37Z is after 2026-02-23T05:33:13Z Mar 08 00:06:38 crc kubenswrapper[4713]: I0308 00:06:38.480596 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:38Z is after 2026-02-23T05:33:13Z Mar 08 00:06:38 crc kubenswrapper[4713]: I0308 00:06:38.540208 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:38 crc kubenswrapper[4713]: I0308 00:06:38.541361 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:38 crc kubenswrapper[4713]: I0308 00:06:38.541432 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:38 crc kubenswrapper[4713]: I0308 00:06:38.541442 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:38 crc kubenswrapper[4713]: I0308 00:06:38.541988 4713 scope.go:117] "RemoveContainer" containerID="ba0fec7e634640b5dace3848ee394f9c875b4ca833f93363a128e2304ef8d418" Mar 08 00:06:38 crc kubenswrapper[4713]: E0308 00:06:38.542146 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:39 crc kubenswrapper[4713]: I0308 00:06:39.480412 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:39Z is after 2026-02-23T05:33:13Z Mar 08 00:06:40 crc kubenswrapper[4713]: E0308 00:06:40.175952 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ab4f00f5f5726 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,LastTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:40 crc kubenswrapper[4713]: I0308 00:06:40.479949 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:40Z is after 2026-02-23T05:33:13Z Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.482865 4713 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:41Z is after 2026-02-23T05:33:13Z Mar 08 00:06:41 crc kubenswrapper[4713]: E0308 00:06:41.594096 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:41Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.601467 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.602807 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.602897 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.602922 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.602964 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:06:41 crc kubenswrapper[4713]: E0308 00:06:41.605555 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:41Z is after 2026-02-23T05:33:13Z" node="crc" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.963967 4713 
patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.964064 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.964124 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.964282 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.965688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.965723 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.965733 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.966185 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc"} 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 08 00:06:41 crc kubenswrapper[4713]: I0308 00:06:41.966267 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc" gracePeriod=30 Mar 08 00:06:42 crc kubenswrapper[4713]: I0308 00:06:42.480236 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:42Z is after 2026-02-23T05:33:13Z Mar 08 00:06:42 crc kubenswrapper[4713]: I0308 00:06:42.765876 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 08 00:06:42 crc kubenswrapper[4713]: I0308 00:06:42.767052 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 08 00:06:42 crc kubenswrapper[4713]: I0308 00:06:42.767431 4713 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc" exitCode=255 Mar 08 00:06:42 crc kubenswrapper[4713]: I0308 00:06:42.767469 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc"} Mar 08 00:06:42 crc kubenswrapper[4713]: I0308 00:06:42.767494 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bea7e2638bea2767584ec8289d15911e98d3f0a7ae48a032b89b4466bd807e8b"} Mar 08 00:06:42 crc kubenswrapper[4713]: I0308 00:06:42.767511 4713 scope.go:117] "RemoveContainer" containerID="0be7ef7bc48e87864d8d0199a68369427a1925475c7b4edd1d2554d21a165fcc" Mar 08 00:06:42 crc kubenswrapper[4713]: I0308 00:06:42.767643 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:42 crc kubenswrapper[4713]: I0308 00:06:42.768787 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:42 crc kubenswrapper[4713]: I0308 00:06:42.768844 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:42 crc kubenswrapper[4713]: I0308 00:06:42.768859 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:43 crc kubenswrapper[4713]: I0308 00:06:43.481781 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:43Z is after 2026-02-23T05:33:13Z Mar 08 00:06:43 crc kubenswrapper[4713]: I0308 00:06:43.775373 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 08 
00:06:43 crc kubenswrapper[4713]: I0308 00:06:43.777692 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:43 crc kubenswrapper[4713]: I0308 00:06:43.778917 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:43 crc kubenswrapper[4713]: I0308 00:06:43.778972 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:43 crc kubenswrapper[4713]: I0308 00:06:43.778994 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:44 crc kubenswrapper[4713]: I0308 00:06:44.481840 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:45 crc kubenswrapper[4713]: I0308 00:06:45.480165 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:46 crc kubenswrapper[4713]: I0308 00:06:46.485790 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:46 crc kubenswrapper[4713]: I0308 00:06:46.521034 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:06:46 crc kubenswrapper[4713]: I0308 00:06:46.521495 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:46 crc kubenswrapper[4713]: I0308 
00:06:46.522902 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:46 crc kubenswrapper[4713]: I0308 00:06:46.522967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:46 crc kubenswrapper[4713]: I0308 00:06:46.522993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:46 crc kubenswrapper[4713]: E0308 00:06:46.618648 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:06:47 crc kubenswrapper[4713]: I0308 00:06:47.482095 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:48 crc kubenswrapper[4713]: I0308 00:06:48.483800 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:48 crc kubenswrapper[4713]: E0308 00:06:48.601256 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 00:06:48 crc kubenswrapper[4713]: I0308 00:06:48.606367 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:48 crc kubenswrapper[4713]: I0308 00:06:48.607694 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:48 crc kubenswrapper[4713]: I0308 
00:06:48.607877 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:48 crc kubenswrapper[4713]: I0308 00:06:48.607983 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:48 crc kubenswrapper[4713]: I0308 00:06:48.608096 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:06:48 crc kubenswrapper[4713]: E0308 00:06:48.614612 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 00:06:48 crc kubenswrapper[4713]: I0308 00:06:48.963650 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:06:48 crc kubenswrapper[4713]: I0308 00:06:48.963844 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:48 crc kubenswrapper[4713]: I0308 00:06:48.965130 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:48 crc kubenswrapper[4713]: I0308 00:06:48.965168 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:48 crc kubenswrapper[4713]: I0308 00:06:48.965180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:49 crc kubenswrapper[4713]: I0308 00:06:49.480060 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.181516 4713 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f00f5f5726 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,LastTimestamp:2026-03-08 00:05:46.466768678 +0000 UTC m=+0.586400951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.186575 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134863b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532373426 +0000 UTC m=+0.652005649,LastTimestamp:2026-03-08 00:05:46.532373426 +0000 UTC m=+0.652005649,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.191656 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134898a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532386976 +0000 UTC m=+0.652019209,LastTimestamp:2026-03-08 00:05:46.532386976 +0000 UTC m=+0.652019209,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.195704 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f01348bbdd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532395997 +0000 UTC m=+0.652028230,LastTimestamp:2026-03-08 00:05:46.532395997 +0000 UTC m=+0.652028230,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.200207 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f017db7577 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.609120631 +0000 UTC m=+0.728752864,LastTimestamp:2026-03-08 00:05:46.609120631 +0000 UTC m=+0.728752864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.204977 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134863b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134863b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532373426 +0000 UTC m=+0.652005649,LastTimestamp:2026-03-08 00:05:46.642999852 +0000 UTC m=+0.762632085,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.208584 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134898a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134898a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532386976 +0000 UTC m=+0.652019209,LastTimestamp:2026-03-08 
00:05:46.643020872 +0000 UTC m=+0.762653105,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.213725 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f01348bbdd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f01348bbdd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532395997 +0000 UTC m=+0.652028230,LastTimestamp:2026-03-08 00:05:46.643029982 +0000 UTC m=+0.762662215,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.218290 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134863b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134863b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532373426 +0000 UTC m=+0.652005649,LastTimestamp:2026-03-08 00:05:46.643979895 +0000 UTC m=+0.763612168,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.222484 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134898a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134898a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532386976 +0000 UTC m=+0.652019209,LastTimestamp:2026-03-08 00:05:46.644083077 +0000 UTC m=+0.763715350,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.225965 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f01348bbdd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f01348bbdd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532395997 +0000 UTC m=+0.652028230,LastTimestamp:2026-03-08 00:05:46.644144039 +0000 UTC m=+0.763776302,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.230792 4713 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189ab4f0134863b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134863b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532373426 +0000 UTC m=+0.652005649,LastTimestamp:2026-03-08 00:05:46.645050171 +0000 UTC m=+0.764682434,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.234931 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134898a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134898a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532386976 +0000 UTC m=+0.652019209,LastTimestamp:2026-03-08 00:05:46.645078241 +0000 UTC m=+0.764710504,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.239490 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f01348bbdd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f01348bbdd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532395997 +0000 UTC m=+0.652028230,LastTimestamp:2026-03-08 00:05:46.645113492 +0000 UTC m=+0.764745765,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.244047 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134863b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134863b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532373426 +0000 UTC m=+0.652005649,LastTimestamp:2026-03-08 00:05:46.64587771 +0000 UTC m=+0.765509943,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.248573 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134898a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134898a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532386976 +0000 UTC m=+0.652019209,LastTimestamp:2026-03-08 00:05:46.645892171 +0000 UTC m=+0.765524404,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.252785 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f01348bbdd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f01348bbdd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532395997 +0000 UTC m=+0.652028230,LastTimestamp:2026-03-08 00:05:46.645900161 +0000 UTC m=+0.765532394,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.256964 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134863b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134863b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532373426 +0000 UTC m=+0.652005649,LastTimestamp:2026-03-08 00:05:46.645996813 +0000 UTC m=+0.765629046,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.266298 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134898a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134898a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532386976 +0000 UTC m=+0.652019209,LastTimestamp:2026-03-08 00:05:46.646005843 +0000 UTC m=+0.765638076,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.270715 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f01348bbdd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f01348bbdd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532395997 +0000 UTC 
m=+0.652028230,LastTimestamp:2026-03-08 00:05:46.646013364 +0000 UTC m=+0.765645597,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.275097 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134863b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134863b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532373426 +0000 UTC m=+0.652005649,LastTimestamp:2026-03-08 00:05:46.647357416 +0000 UTC m=+0.766989659,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.279493 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134898a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134898a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532386976 +0000 UTC m=+0.652019209,LastTimestamp:2026-03-08 00:05:46.647382036 +0000 UTC m=+0.767014279,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.284518 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f01348bbdd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f01348bbdd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532395997 +0000 UTC m=+0.652028230,LastTimestamp:2026-03-08 00:05:46.647394757 +0000 UTC m=+0.767027000,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.288673 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134863b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134863b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532373426 +0000 UTC m=+0.652005649,LastTimestamp:2026-03-08 00:05:46.64837604 +0000 UTC m=+0.768008313,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.292819 4713 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ab4f0134898a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ab4f0134898a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:46.532386976 +0000 UTC m=+0.652019209,LastTimestamp:2026-03-08 00:05:46.648408311 +0000 UTC m=+0.768040574,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.297093 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0322b4f9d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.050561437 +0000 UTC m=+1.170193690,LastTimestamp:2026-03-08 00:05:47.050561437 +0000 UTC m=+1.170193690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 
00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.300529 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f03243b988 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.052161416 +0000 UTC m=+1.171793649,LastTimestamp:2026-03-08 00:05:47.052161416 +0000 UTC m=+1.171793649,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.303872 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f0324a4d02 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.052592386 +0000 UTC m=+1.172224619,LastTimestamp:2026-03-08 00:05:47.052592386 +0000 UTC m=+1.172224619,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.306913 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f032729887 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.055233159 +0000 UTC m=+1.174865402,LastTimestamp:2026-03-08 00:05:47.055233159 +0000 UTC m=+1.174865402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.309927 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ab4f032a9003e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.058798654 +0000 UTC m=+1.178430887,LastTimestamp:2026-03-08 00:05:47.058798654 +0000 UTC m=+1.178430887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.313399 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f05406b52e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.61858795 +0000 UTC m=+1.738220173,LastTimestamp:2026-03-08 00:05:47.61858795 +0000 UTC m=+1.738220173,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.316559 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0542567a6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.620599718 +0000 UTC m=+1.740231951,LastTimestamp:2026-03-08 00:05:47.620599718 +0000 UTC m=+1.740231951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.320758 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f054269aba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.62067833 +0000 UTC m=+1.740310563,LastTimestamp:2026-03-08 00:05:47.62067833 +0000 UTC m=+1.740310563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.323987 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ab4f054324c15 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.621444629 +0000 UTC m=+1.741076872,LastTimestamp:2026-03-08 00:05:47.621444629 +0000 UTC m=+1.741076872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.327458 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f054331236 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.62149535 +0000 UTC m=+1.741127583,LastTimestamp:2026-03-08 00:05:47.62149535 +0000 UTC m=+1.741127583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.331523 4713 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f054c9adf0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.631365616 +0000 UTC m=+1.750997849,LastTimestamp:2026-03-08 00:05:47.631365616 +0000 UTC m=+1.750997849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.334898 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f054ee2795 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.633756053 +0000 UTC m=+1.753388316,LastTimestamp:2026-03-08 00:05:47.633756053 +0000 UTC 
m=+1.753388316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.338229 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f054f317cb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.634079691 +0000 UTC m=+1.753711934,LastTimestamp:2026-03-08 00:05:47.634079691 +0000 UTC m=+1.753711934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.341716 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f054fabdcf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.634580943 +0000 UTC 
m=+1.754213176,LastTimestamp:2026-03-08 00:05:47.634580943 +0000 UTC m=+1.754213176,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.345411 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f05514050e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.636237582 +0000 UTC m=+1.755869825,LastTimestamp:2026-03-08 00:05:47.636237582 +0000 UTC m=+1.755869825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.348719 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ab4f0559c0278 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.645149816 +0000 UTC 
m=+1.764782059,LastTimestamp:2026-03-08 00:05:47.645149816 +0000 UTC m=+1.764782059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.352962 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f0695e87dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.976665053 +0000 UTC m=+2.096297326,LastTimestamp:2026-03-08 00:05:47.976665053 +0000 UTC m=+2.096297326,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.355979 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f06a985b21 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.997231905 +0000 UTC m=+2.116864148,LastTimestamp:2026-03-08 00:05:47.997231905 +0000 UTC m=+2.116864148,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.358998 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f06ab04f8a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.998801802 +0000 UTC m=+2.118434035,LastTimestamp:2026-03-08 00:05:47.998801802 +0000 UTC m=+2.118434035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.362693 4713 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f079b7d5f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.250953207 +0000 UTC m=+2.370585480,LastTimestamp:2026-03-08 00:05:48.250953207 +0000 UTC m=+2.370585480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.365774 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f07ab64b39 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.267629369 +0000 UTC m=+2.387261642,LastTimestamp:2026-03-08 00:05:48.267629369 +0000 UTC m=+2.387261642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.369666 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f07ad4a8b3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.269619379 +0000 UTC m=+2.389251652,LastTimestamp:2026-03-08 00:05:48.269619379 +0000 UTC m=+2.389251652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.373798 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f089070d4a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.507802954 +0000 UTC m=+2.627435187,LastTimestamp:2026-03-08 00:05:48.507802954 +0000 UTC m=+2.627435187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.377676 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f08aee4a72 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.539734642 +0000 UTC m=+2.659366875,LastTimestamp:2026-03-08 00:05:48.539734642 +0000 UTC m=+2.659366875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.381951 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f08be7af61 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.556078945 +0000 UTC m=+2.675711198,LastTimestamp:2026-03-08 00:05:48.556078945 +0000 UTC m=+2.675711198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.386991 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f08c607ee1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.563996385 +0000 UTC m=+2.683628618,LastTimestamp:2026-03-08 00:05:48.563996385 +0000 UTC 
m=+2.683628618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.391099 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f08c7e537b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.565951355 +0000 UTC m=+2.685583588,LastTimestamp:2026-03-08 00:05:48.565951355 +0000 UTC m=+2.685583588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.395076 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ab4f08cc91301 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.570850049 +0000 UTC m=+2.690482282,LastTimestamp:2026-03-08 00:05:48.570850049 +0000 UTC m=+2.690482282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.399083 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f0974b5e83 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.747161219 +0000 UTC m=+2.866793452,LastTimestamp:2026-03-08 00:05:48.747161219 +0000 UTC m=+2.866793452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.403053 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f0975ae449 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.748178505 +0000 UTC m=+2.867810738,LastTimestamp:2026-03-08 00:05:48.748178505 +0000 UTC m=+2.867810738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.409008 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f097f91ad5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.758547157 +0000 UTC m=+2.878179390,LastTimestamp:2026-03-08 00:05:48.758547157 +0000 UTC m=+2.878179390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.413472 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0980d2781 openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.759861121 +0000 UTC m=+2.879493354,LastTimestamp:2026-03-08 00:05:48.759861121 +0000 UTC m=+2.879493354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.419862 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f098117a68 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.760144488 +0000 UTC m=+2.879776731,LastTimestamp:2026-03-08 00:05:48.760144488 +0000 UTC m=+2.879776731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.425498 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ab4f0983065d7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.762170839 +0000 UTC m=+2.881803062,LastTimestamp:2026-03-08 00:05:48.762170839 +0000 UTC m=+2.881803062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.429973 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f099931cdd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.785417437 +0000 UTC m=+2.905049670,LastTimestamp:2026-03-08 00:05:48.785417437 +0000 UTC m=+2.905049670,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.434872 4713 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f099abb29e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.787028638 +0000 UTC m=+2.906660881,LastTimestamp:2026-03-08 00:05:48.787028638 +0000 UTC m=+2.906660881,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.440724 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ab4f099d2ae84 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.789583492 +0000 UTC m=+2.909215725,LastTimestamp:2026-03-08 00:05:48.789583492 
+0000 UTC m=+2.909215725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.445370 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f099dd3226 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.79027255 +0000 UTC m=+2.909904783,LastTimestamp:2026-03-08 00:05:48.79027255 +0000 UTC m=+2.909904783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.449808 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f0a3958cce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.953349326 +0000 UTC 
m=+3.072981559,LastTimestamp:2026-03-08 00:05:48.953349326 +0000 UTC m=+3.072981559,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.455940 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f0a463f336 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.966875958 +0000 UTC m=+3.086508191,LastTimestamp:2026-03-08 00:05:48.966875958 +0000 UTC m=+3.086508191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.461818 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f0a4760e54 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:48.968062548 +0000 UTC m=+3.087694781,LastTimestamp:2026-03-08 00:05:48.968062548 +0000 UTC m=+3.087694781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.465908 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0a6be8325 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.006365477 +0000 UTC m=+3.125997710,LastTimestamp:2026-03-08 00:05:49.006365477 +0000 UTC m=+3.125997710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.469020 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0a77f7a21 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.019011617 +0000 UTC m=+3.138643840,LastTimestamp:2026-03-08 00:05:49.019011617 +0000 UTC m=+3.138643840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.470708 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0a798fb1f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.020683039 +0000 UTC m=+3.140315272,LastTimestamp:2026-03-08 00:05:49.020683039 +0000 UTC m=+3.140315272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.474988 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f0aeacc999 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.139421593 +0000 UTC m=+3.259053826,LastTimestamp:2026-03-08 00:05:49.139421593 +0000 UTC m=+3.259053826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: I0308 00:06:50.480653 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.480563 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ab4f0afcd81aa openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container 
kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.158343082 +0000 UTC m=+3.277975315,LastTimestamp:2026-03-08 00:05:49.158343082 +0000 UTC m=+3.277975315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.484808 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0b18a3d4b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.187489099 +0000 UTC m=+3.307121332,LastTimestamp:2026-03-08 00:05:49.187489099 +0000 UTC m=+3.307121332,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.488783 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0b26106e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.201565415 +0000 UTC m=+3.321197648,LastTimestamp:2026-03-08 00:05:49.201565415 +0000 UTC m=+3.321197648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.493009 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0b270e7f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.202606071 +0000 UTC m=+3.322238304,LastTimestamp:2026-03-08 00:05:49.202606071 +0000 UTC m=+3.322238304,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.496951 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0ba354002 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.332914178 +0000 UTC m=+3.452546421,LastTimestamp:2026-03-08 00:05:49.332914178 +0000 UTC m=+3.452546421,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.501005 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0baca3641 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.342676545 +0000 UTC m=+3.462308788,LastTimestamp:2026-03-08 00:05:49.342676545 +0000 UTC m=+3.462308788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.505241 
4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0badca476 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.343884406 +0000 UTC m=+3.463516659,LastTimestamp:2026-03-08 00:05:49.343884406 +0000 UTC m=+3.463516659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.509135 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0c4f404b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.513188529 +0000 UTC m=+3.632820762,LastTimestamp:2026-03-08 00:05:49.513188529 +0000 UTC 
m=+3.632820762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.515231 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0c57903e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.521904609 +0000 UTC m=+3.641536842,LastTimestamp:2026-03-08 00:05:49.521904609 +0000 UTC m=+3.641536842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.520335 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f0c87e696c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.572589932 +0000 UTC m=+3.692222175,LastTimestamp:2026-03-08 00:05:49.572589932 +0000 UTC m=+3.692222175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.528401 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f0d628f1e0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.801869792 +0000 UTC m=+3.921502035,LastTimestamp:2026-03-08 00:05:49.801869792 +0000 UTC m=+3.921502035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.534655 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f0d71be64b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.817792075 +0000 UTC m=+3.937424358,LastTimestamp:2026-03-08 00:05:49.817792075 +0000 UTC m=+3.937424358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.542445 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f105382195 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:50.591394197 +0000 UTC m=+4.711026470,LastTimestamp:2026-03-08 00:05:50.591394197 +0000 UTC m=+4.711026470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.548980 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f114c636fd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:50.852364029 +0000 UTC m=+4.971996272,LastTimestamp:2026-03-08 00:05:50.852364029 +0000 UTC m=+4.971996272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.552752 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f11599c262 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:50.86622781 +0000 UTC m=+4.985860063,LastTimestamp:2026-03-08 00:05:50.86622781 +0000 UTC m=+4.985860063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.556401 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f115b1a417 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:50.867792919 +0000 UTC m=+4.987425192,LastTimestamp:2026-03-08 00:05:50.867792919 +0000 UTC m=+4.987425192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.560659 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f123d1f903 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.104792835 +0000 UTC m=+5.224425068,LastTimestamp:2026-03-08 00:05:51.104792835 +0000 UTC m=+5.224425068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.566395 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f124cb1cac openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.121120428 +0000 UTC m=+5.240752711,LastTimestamp:2026-03-08 00:05:51.121120428 +0000 UTC m=+5.240752711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.570393 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f124e3360c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.122699788 +0000 UTC m=+5.242332051,LastTimestamp:2026-03-08 00:05:51.122699788 +0000 UTC m=+5.242332051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.572641 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ab4f131f8fefc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.342231292 +0000 UTC m=+5.461863545,LastTimestamp:2026-03-08 00:05:51.342231292 +0000 UTC m=+5.461863545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.576463 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f1328df03b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.351992379 +0000 UTC m=+5.471624612,LastTimestamp:2026-03-08 00:05:51.351992379 +0000 UTC m=+5.471624612,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.581640 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f132a64c5d openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.353588829 +0000 UTC m=+5.473221062,LastTimestamp:2026-03-08 00:05:51.353588829 +0000 UTC m=+5.473221062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.586208 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f13b9b71f5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.503872501 +0000 UTC m=+5.623504734,LastTimestamp:2026-03-08 00:05:51.503872501 +0000 UTC m=+5.623504734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.590285 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ab4f13c4e1eb0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.515582128 +0000 UTC m=+5.635214361,LastTimestamp:2026-03-08 00:05:51.515582128 +0000 UTC m=+5.635214361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.594276 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f13c5da731 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.516600113 +0000 UTC m=+5.636232356,LastTimestamp:2026-03-08 00:05:51.516600113 +0000 UTC m=+5.636232356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.599357 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f145f8e795 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.677769621 +0000 UTC m=+5.797401854,LastTimestamp:2026-03-08 00:05:51.677769621 +0000 UTC m=+5.797401854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.603312 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ab4f146920e65 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.687806565 +0000 UTC m=+5.807438798,LastTimestamp:2026-03-08 00:05:51.687806565 +0000 UTC m=+5.807438798,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.607652 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 00:06:50 crc 
kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189ab4f15702dd12 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded Mar 08 00:06:50 crc kubenswrapper[4713]: body: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.963634962 +0000 UTC m=+6.083267235,LastTimestamp:2026-03-08 00:05:51.963634962 +0000 UTC m=+6.083267235,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.611410 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f1570413e6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:51.963714534 +0000 UTC m=+6.083346797,LastTimestamp:2026-03-08 00:05:51.963714534 +0000 UTC 
m=+6.083346797,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.618094 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-apiserver-crc.189ab4f33fb94478 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 08 00:06:50 crc kubenswrapper[4713]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 00:06:50 crc kubenswrapper[4713]: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:00.162870392 +0000 UTC m=+14.282502645,LastTimestamp:2026-03-08 00:06:00.162870392 +0000 UTC m=+14.282502645,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.622352 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f33fbadd6d openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:00.162975085 +0000 UTC m=+14.282607358,LastTimestamp:2026-03-08 00:06:00.162975085 +0000 UTC m=+14.282607358,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.626666 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ab4f33fb94478\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-apiserver-crc.189ab4f33fb94478 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 08 00:06:50 crc kubenswrapper[4713]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 00:06:50 crc kubenswrapper[4713]: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:00.162870392 +0000 UTC m=+14.282502645,LastTimestamp:2026-03-08 00:06:00.170192917 +0000 UTC 
m=+14.289825150,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.630960 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ab4f33fbadd6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f33fbadd6d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:00.162975085 +0000 UTC m=+14.282607358,LastTimestamp:2026-03-08 00:06:00.170236628 +0000 UTC m=+14.289868861,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.634855 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ab4f0badca476\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0badca476 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.343884406 +0000 UTC m=+3.463516659,LastTimestamp:2026-03-08 00:06:00.632333569 +0000 UTC m=+14.751965802,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.638578 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ab4f0c4f404b1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0c4f404b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.513188529 +0000 UTC m=+3.632820762,LastTimestamp:2026-03-08 00:06:00.850594661 +0000 UTC m=+14.970226894,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.644067 4713 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189ab4f0c57903e1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ab4f0c57903e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:49.521904609 +0000 UTC m=+3.641536842,LastTimestamp:2026-03-08 00:06:00.862786469 +0000 UTC m=+14.982418702,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.650057 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab17f343 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 00:06:50 crc kubenswrapper[4713]: body: Mar 08 00:06:50 crc kubenswrapper[4713]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964237635 +0000 UTC m=+16.083869898,LastTimestamp:2026-03-08 00:06:01.964237635 +0000 UTC m=+16.083869898,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.654179 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab193737 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964320567 +0000 UTC m=+16.083952840,LastTimestamp:2026-03-08 00:06:01.964320567 +0000 UTC m=+16.083952840,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.660039 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f3ab17f343\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab17f343 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 00:06:50 crc kubenswrapper[4713]: body: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964237635 +0000 UTC m=+16.083869898,LastTimestamp:2026-03-08 00:06:11.963604876 +0000 UTC m=+26.083237119,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.665314 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f3ab193737\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab193737 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964320567 +0000 UTC m=+16.083952840,LastTimestamp:2026-03-08 00:06:11.963655787 +0000 UTC m=+26.083288030,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.670164 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f5ff4971dc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:11.966702044 +0000 UTC m=+26.086334377,LastTimestamp:2026-03-08 00:06:11.966702044 +0000 UTC m=+26.086334377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.683422 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f054ee2795\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f054ee2795 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.633756053 +0000 UTC m=+1.753388316,LastTimestamp:2026-03-08 00:06:12.084455383 +0000 UTC m=+26.204087626,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.688537 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f0695e87dd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f0695e87dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.976665053 +0000 UTC m=+2.096297326,LastTimestamp:2026-03-08 00:06:12.22150699 +0000 UTC m=+26.341139213,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.693848 4713 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189ab4f06a985b21\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f06a985b21 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:05:47.997231905 +0000 UTC m=+2.116864148,LastTimestamp:2026-03-08 00:06:12.239866785 +0000 UTC m=+26.359499058,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.700436 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f3ab17f343\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab17f343 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
08 00:06:50 crc kubenswrapper[4713]: body: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964237635 +0000 UTC m=+16.083869898,LastTimestamp:2026-03-08 00:06:21.964333619 +0000 UTC m=+36.083965872,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.705866 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f3ab193737\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab193737 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964320567 +0000 UTC m=+16.083952840,LastTimestamp:2026-03-08 00:06:21.96438592 +0000 UTC m=+36.084018163,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:06:50 crc kubenswrapper[4713]: E0308 00:06:50.711131 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ab4f3ab17f343\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 00:06:50 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189ab4f3ab17f343 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 00:06:50 crc kubenswrapper[4713]: body: Mar 08 00:06:50 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:06:01.964237635 +0000 UTC m=+16.083869898,LastTimestamp:2026-03-08 00:06:31.964216751 +0000 UTC m=+46.083849014,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 00:06:50 crc kubenswrapper[4713]: > Mar 08 00:06:51 crc kubenswrapper[4713]: I0308 00:06:51.481552 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:51 crc kubenswrapper[4713]: I0308 00:06:51.964359 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 00:06:51 crc kubenswrapper[4713]: I0308 00:06:51.964408 4713 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.482349 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.540753 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.542256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.542312 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.542330 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.543227 4713 scope.go:117] "RemoveContainer" containerID="ba0fec7e634640b5dace3848ee394f9c875b4ca833f93363a128e2304ef8d418" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.802615 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.804723 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707"} Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.805010 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.806021 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.806055 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:52 crc kubenswrapper[4713]: I0308 00:06:52.806066 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.481770 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.552959 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.809378 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.810133 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.812538 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" exitCode=255 Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.812580 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707"} Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.812615 4713 scope.go:117] "RemoveContainer" containerID="ba0fec7e634640b5dace3848ee394f9c875b4ca833f93363a128e2304ef8d418" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.812635 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.813568 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.813593 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.813608 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:53 crc kubenswrapper[4713]: I0308 00:06:53.814130 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:06:53 crc kubenswrapper[4713]: E0308 00:06:53.814302 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.290096 4713 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.481309 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.816003 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.818094 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.818844 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.818980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.819072 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:54 crc kubenswrapper[4713]: I0308 00:06:54.819714 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:06:54 crc kubenswrapper[4713]: E0308 00:06:54.820047 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:55 crc 
kubenswrapper[4713]: I0308 00:06:55.480427 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:55 crc kubenswrapper[4713]: E0308 00:06:55.606160 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.615311 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.616450 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.616484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.616496 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.616526 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:06:55 crc kubenswrapper[4713]: E0308 00:06:55.621920 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.819620 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.820719 4713 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.820802 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.820882 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:55 crc kubenswrapper[4713]: I0308 00:06:55.821325 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:06:55 crc kubenswrapper[4713]: E0308 00:06:55.821511 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:06:56 crc kubenswrapper[4713]: I0308 00:06:56.431761 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 00:06:56 crc kubenswrapper[4713]: I0308 00:06:56.447464 4713 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 00:06:56 crc kubenswrapper[4713]: I0308 00:06:56.483587 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:56 crc kubenswrapper[4713]: E0308 00:06:56.619918 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:06:57 crc kubenswrapper[4713]: I0308 00:06:57.481127 4713 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.482701 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.969531 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.969702 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.970738 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.970778 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.970787 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:06:58 crc kubenswrapper[4713]: I0308 00:06:58.975590 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:06:59 crc kubenswrapper[4713]: I0308 00:06:59.480749 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:06:59 crc kubenswrapper[4713]: I0308 00:06:59.829439 4713 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 08 00:06:59 crc kubenswrapper[4713]: I0308 00:06:59.830308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:06:59 crc kubenswrapper[4713]: I0308 00:06:59.830436 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:06:59 crc kubenswrapper[4713]: I0308 00:06:59.830526 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:00 crc kubenswrapper[4713]: I0308 00:07:00.482330 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 00:07:01 crc kubenswrapper[4713]: I0308 00:07:01.327612 4713 csr.go:261] certificate signing request csr-bj8qx is approved, waiting to be issued Mar 08 00:07:01 crc kubenswrapper[4713]: I0308 00:07:01.334665 4713 csr.go:257] certificate signing request csr-bj8qx is issued Mar 08 00:07:01 crc kubenswrapper[4713]: I0308 00:07:01.388837 4713 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.314808 4713 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.336222 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-21 22:24:43.889191823 +0000 UTC Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.336379 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6214h17m41.552820944s for next certificate rotation Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.622685 
4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.623708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.623745 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.623755 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.623860 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.632412 4713 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.632847 4713 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.632880 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.636375 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.636421 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.636431 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.636449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.636461 4713 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:02Z","lastTransitionTime":"2026-03-08T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.647941 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.654852 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.654884 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.654895 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.654911 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.654922 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:02Z","lastTransitionTime":"2026-03-08T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.663361 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.669923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.669955 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.669964 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.669977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.669987 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:02Z","lastTransitionTime":"2026-03-08T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.682043 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.689288 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.689362 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.689376 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.689400 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:02 crc kubenswrapper[4713]: I0308 00:07:02.689438 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:02Z","lastTransitionTime":"2026-03-08T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.699709 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.699841 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.699866 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.800264 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:02 crc kubenswrapper[4713]: E0308 00:07:02.901377 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.002239 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.102973 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.203722 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.304444 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.405371 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.505504 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.605872 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.706623 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.807259 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:03 crc kubenswrapper[4713]: E0308 00:07:03.907654 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.008717 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.109461 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.210132 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.310999 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.411966 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.513076 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.614125 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.714990 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc 
kubenswrapper[4713]: E0308 00:07:04.815554 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:04 crc kubenswrapper[4713]: E0308 00:07:04.916594 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.017495 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.118011 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.218910 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.319962 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.421110 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.521725 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.622406 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.723100 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.823478 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:05 crc kubenswrapper[4713]: E0308 00:07:05.923858 4713 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.024920 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.125706 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.225893 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.326197 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.426536 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.527090 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.620456 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.627315 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.728036 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.828856 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:06 crc kubenswrapper[4713]: E0308 00:07:06.929575 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.030157 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.130262 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.231442 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.332444 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.432571 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.533461 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.634496 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.735241 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.836100 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:07 crc kubenswrapper[4713]: E0308 00:07:07.937277 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.038388 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.139438 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc 
kubenswrapper[4713]: E0308 00:07:08.239632 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.340859 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.441004 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.541952 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.642934 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.743928 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.845086 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:08 crc kubenswrapper[4713]: E0308 00:07:08.945700 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.046656 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.147346 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.248256 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.349435 4713 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.450655 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: I0308 00:07:09.540743 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:07:09 crc kubenswrapper[4713]: I0308 00:07:09.541960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:09 crc kubenswrapper[4713]: I0308 00:07:09.542015 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:09 crc kubenswrapper[4713]: I0308 00:07:09.542033 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:09 crc kubenswrapper[4713]: I0308 00:07:09.542999 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.543279 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.550974 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.651795 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.751933 4713 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.853009 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:09 crc kubenswrapper[4713]: E0308 00:07:09.953752 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.053925 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.154749 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.255919 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.356845 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: I0308 00:07:10.408666 4713 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.457912 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.558912 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.659990 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.760446 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.860933 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:10 crc kubenswrapper[4713]: E0308 00:07:10.961103 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.062307 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.163417 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.264392 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.364790 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.464971 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.565513 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.665623 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.765989 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.867067 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:11 crc kubenswrapper[4713]: E0308 00:07:11.967621 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc 
kubenswrapper[4713]: E0308 00:07:12.068428 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.169236 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.270232 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.370450 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.471479 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.572547 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.672997 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.773601 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.874144 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.975060 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: E0308 00:07:12.993610 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 08 00:07:12 crc kubenswrapper[4713]: I0308 00:07:12.997740 4713 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:12 crc kubenswrapper[4713]: I0308 00:07:12.997790 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:12 crc kubenswrapper[4713]: I0308 00:07:12.997807 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:12 crc kubenswrapper[4713]: I0308 00:07:12.997860 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:12 crc kubenswrapper[4713]: I0308 00:07:12.997905 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:12Z","lastTransitionTime":"2026-03-08T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.009568 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.014639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.014753 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.014776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.014799 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.014817 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:13Z","lastTransitionTime":"2026-03-08T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.031696 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.036861 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.036961 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.036980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.037007 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.037059 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:13Z","lastTransitionTime":"2026-03-08T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.055348 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.060715 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.060776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.060795 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.060846 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.060872 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:13Z","lastTransitionTime":"2026-03-08T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.077081 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.077304 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.077343 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.177982 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.278901 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.379757 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.480531 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.540484 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.542079 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 
00:07:13.542112 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:13 crc kubenswrapper[4713]: I0308 00:07:13.542124 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.581573 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.682674 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.782873 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.883025 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:13 crc kubenswrapper[4713]: E0308 00:07:13.984061 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.084888 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.185998 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.287056 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.387343 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.487501 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.588281 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.688605 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.788714 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.889729 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:14 crc kubenswrapper[4713]: E0308 00:07:14.990752 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.091416 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.192135 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.293222 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.394483 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.495163 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.596050 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.696463 4713 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.797223 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.898333 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:15 crc kubenswrapper[4713]: E0308 00:07:15.998655 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: I0308 00:07:16.009437 4713 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.099373 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.200234 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.301099 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.401968 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.502900 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.603907 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.621180 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.704438 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.805847 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:16 crc kubenswrapper[4713]: E0308 00:07:16.906673 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.007162 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.107522 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.208481 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.309611 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.409718 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.510945 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.611914 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.712624 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc kubenswrapper[4713]: E0308 00:07:17.813403 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:17 crc 
kubenswrapper[4713]: E0308 00:07:17.914372 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.015317 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.116132 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.216933 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.317096 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.417706 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.518877 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.619789 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.719932 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.820195 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:18 crc kubenswrapper[4713]: E0308 00:07:18.920716 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.021593 4713 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.122675 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.223711 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.305124 4713 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.326038 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.326061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.326068 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.326080 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.326089 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.428844 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.429207 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.429334 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.429449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.429572 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.506869 4713 apiserver.go:52] "Watching apiserver" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.517604 4713 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.518008 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.518637 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.518897 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.519328 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.519217 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.520017 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.520605 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.521242 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.521540 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.521583 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.521726 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.521794 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.521726 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.523759 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.523947 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.524144 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.524275 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.524741 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.526118 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.531809 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.532008 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.532121 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc 
kubenswrapper[4713]: I0308 00:07:19.532887 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.532908 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.543928 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.555243 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.569617 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.580016 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.583758 4713 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.589817 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.599309 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.611965 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.612113 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.612507 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.612714 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.613473 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.614895 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.615741 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616013 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616417 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616754 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.613386 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.613795 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617343 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.613920 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.614815 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616085 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616274 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.616662 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617141 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617161 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617807 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.617242 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618043 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618362 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618657 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618802 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619059 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619200 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619589 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618716 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.618990 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619764 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619857 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.619892 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620027 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620136 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620163 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620185 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620207 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620231 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620257 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620281 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620307 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620326 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620345 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 08 
00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620363 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620383 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620403 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620423 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620604 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620624 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620644 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620664 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620686 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620706 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620727 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 00:07:19 crc kubenswrapper[4713]: 
I0308 00:07:19.620746 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620766 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620796 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620848 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620871 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620891 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620913 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620935 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620955 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620973 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.620994 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " 
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621018 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621037 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621058 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621077 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621099 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621202 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621228 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621249 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621269 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621291 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621315 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621334 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621355 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621374 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621394 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621843 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621919 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621958 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622525 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622562 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622573 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622652 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.622666 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:07:20.122643693 +0000 UTC m=+94.242276016 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622698 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.622801 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.621413 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623014 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623037 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623058 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623078 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623108 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623183 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623339 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623678 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623890 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). 
InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.623899 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624007 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624016 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624177 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624269 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624273 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624418 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624548 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624560 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624621 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624648 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624774 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624863 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.624945 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625226 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625246 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625241 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625518 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625623 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625637 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625656 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625704 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625372 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.625526 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626061 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626146 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626278 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626458 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626489 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626511 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626531 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626551 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626570 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626591 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626611 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626631 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626653 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626675 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626695 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626717 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 
00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626737 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626759 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626781 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626810 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626864 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626888 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626908 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626930 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626950 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626973 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626995 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627019 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627040 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627059 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627079 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627101 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627122 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627144 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627163 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627192 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627212 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627233 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627254 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627276 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626023 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626092 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626494 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626597 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626641 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626644 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626864 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626930 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626944 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626945 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.626962 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627256 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627270 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627289 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627291 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627300 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627506 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627561 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627584 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627609 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627618 4713 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627634 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627663 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627689 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627713 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627735 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627760 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627782 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627809 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627851 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627875 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627898 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627923 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627948 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627971 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627992 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628016 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628042 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628065 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628092 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628114 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628138 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628161 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628180 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628200 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628221 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628242 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628265 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628286 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628307 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628327 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628348 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628370 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" 
(UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628390 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628409 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628431 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628451 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628473 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628497 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628519 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628541 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628565 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628588 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628610 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628632 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628652 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628671 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628693 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628715 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628743 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628765 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628788 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628807 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628877 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628901 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 00:07:19 crc 
kubenswrapper[4713]: I0308 00:07:19.628923 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628948 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628970 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628994 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629019 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629041 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629065 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629085 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629106 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629128 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629153 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629177 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629230 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629255 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629276 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629301 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629323 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " 
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629359 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629385 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629409 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629434 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629456 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629479 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629503 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629575 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629597 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629646 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629671 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629700 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629728 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 00:07:19 crc 
kubenswrapper[4713]: I0308 00:07:19.629750 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629772 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629804 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629865 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629887 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629912 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630882 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630899 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630913 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630926 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630940 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630954 4713 reconciler_common.go:293] "Volume 
detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630968 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630981 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630994 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631007 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631019 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631031 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631043 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631055 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631123 4713 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631137 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631150 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631161 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631175 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631187 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node 
\"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631199 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631210 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631222 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631234 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631244 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631763 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631778 4713 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631791 4713 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631803 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631815 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631846 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631859 4713 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631870 4713 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631883 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631904 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631916 4713 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631928 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631941 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631954 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631966 4713 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631981 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631993 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632005 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632017 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632029 4713 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632042 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632053 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632065 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632077 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632088 4713 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632100 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632112 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632122 4713 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632134 4713 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632146 4713 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632158 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632170 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" 
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632183 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632196 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632208 4713 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632336 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632352 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632365 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632378 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 
00:07:19.632391 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632402 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632413 4713 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632421 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632430 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627392 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627715 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627641 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627860 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.627920 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628020 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628070 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628207 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628234 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628313 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628322 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628471 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628533 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633348 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628855 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.628879 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629038 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629337 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629353 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629353 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629483 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629671 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629945 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629983 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.629990 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630051 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630063 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630178 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630249 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630397 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630511 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630574 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630935 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630937 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.630968 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631037 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631239 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631388 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631421 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631650 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631733 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631887 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631937 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.631993 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632295 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632641 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632661 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632724 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632745 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632759 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632773 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.632779 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633168 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633183 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633216 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633407 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633659 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633910 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.633930 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634428 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634482 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634506 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634589 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634723 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634809 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634691 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.634962 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635293 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635348 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635359 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635504 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635913 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635940 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635952 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635972 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.635984 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636160 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636172 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636234 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636311 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636903 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.636983 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.637187 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.637196 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.637520 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.637529 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.637806 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638017 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638015 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638128 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638288 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638513 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.638547 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.638804 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.638909 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:20.138894264 +0000 UTC m=+94.258526497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.639279 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.639315 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:20.139305584 +0000 UTC m=+94.258937917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.641172 4713 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.641805 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.641867 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.642524 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.649056 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.650339 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.650379 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.650414 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.650923 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.651195 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652585 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652624 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652638 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652585 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652693 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652705 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652758 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:20.152739494 +0000 UTC m=+94.272371827 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.652791 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:20.152782365 +0000 UTC m=+94.272414718 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.657597 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.658031 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.659757 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.663211 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.663380 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.664499 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.665369 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.666175 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.666208 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.654674 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.667084 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.668038 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.669752 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.669943 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.670014 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.670324 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.670275 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.672934 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.674707 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.675458 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.675945 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.676174 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.676287 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.676474 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.676859 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.678973 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.679598 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.680913 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681458 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681507 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681541 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681585 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681930 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.681982 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.692000 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.718387 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733432 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733473 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733527 4713 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733541 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733550 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733558 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733567 4713 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733575 4713 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733584 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733578 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733593 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733650 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733660 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733670 4713 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733682 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733693 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733704 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733716 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733729 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733734 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733739 4713 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733767 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733776 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733785 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733792 4713 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733800 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733808 4713 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733815 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733840 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733849 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733857 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733865 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733872 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733881 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733890 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733898 4713 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733906 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733914 4713 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733923 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733932 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733939 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733948 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733957 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733965 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733973 4713 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733981 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733990 4713 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.733998 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734006 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734015 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734023 4713 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734031 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734040 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734049 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734059 4713 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734069 4713 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734077 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734084 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734093 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734101 4713 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734109 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734117 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734125 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734133 4713 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\"
(UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734141 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734150 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734158 4713 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734166 4713 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734174 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734182 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734190 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734198 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734206 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734213 4713 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734221 4713 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734230 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734238 4713 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734246 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" 
Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734254 4713 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734262 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734270 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734278 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734286 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734294 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734301 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734310 4713 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734318 4713 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734326 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734334 4713 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734344 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734352 4713 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734360 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734368 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734376 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734383 4713 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734391 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734399 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734409 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734417 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734425 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node 
\"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734432 4713 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734440 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734450 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734457 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734465 4713 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734472 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734480 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734488 4713 reconciler_common.go:293] "Volume detached for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734496 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734503 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734511 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734519 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734526 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734534 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734543 4713 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath 
\"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734550 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734558 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734565 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734573 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734580 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734588 4713 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734600 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 
00:07:19.734608 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734616 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734624 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734632 4713 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734640 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734647 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.734655 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.737886 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.737914 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.737922 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.737936 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.737946 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.836817 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.839760 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.839877 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.839899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.839930 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.839949 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.843964 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.850615 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.866311 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: source /etc/kubernetes/apiserver-url.env Mar 08 00:07:19 crc kubenswrapper[4713]: else Mar 08 00:07:19 crc kubenswrapper[4713]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 00:07:19 crc kubenswrapper[4713]: exit 1 Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 00:07:19 crc kubenswrapper[4713]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: W0308 00:07:19.867281 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-df2b289f822bc76c592b8649c7aa06c65091beccf8d3647bc795e261789788bd WatchSource:0}: Error finding container df2b289f822bc76c592b8649c7aa06c65091beccf8d3647bc795e261789788bd: Status 404 returned error can't find the container with id df2b289f822bc76c592b8649c7aa06c65091beccf8d3647bc795e261789788bd Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.868322 4713 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.871396 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f "/env/_master" ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: source "/env/_master" Mar 08 00:07:19 crc kubenswrapper[4713]: set +o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 08 00:07:19 crc kubenswrapper[4713]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 08 00:07:19 crc kubenswrapper[4713]: ho_enable="--enable-hybrid-overlay" Mar 08 00:07:19 crc kubenswrapper[4713]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 08 00:07:19 crc kubenswrapper[4713]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 08 00:07:19 crc kubenswrapper[4713]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 08 00:07:19 crc kubenswrapper[4713]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-host=127.0.0.1 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-port=9743 \ Mar 08 00:07:19 crc kubenswrapper[4713]: ${ho_enable} \ Mar 08 00:07:19 crc kubenswrapper[4713]: --enable-interconnect \ Mar 08 00:07:19 crc kubenswrapper[4713]: --disable-approver \ Mar 08 00:07:19 crc kubenswrapper[4713]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --wait-for-kubernetes-api=200s \ Mar 08 00:07:19 crc kubenswrapper[4713]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --loglevel="${LOGLEVEL}" Mar 08 00:07:19 crc kubenswrapper[4713]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: W0308 00:07:19.874049 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-abff03d113473cb80839f92e5900297db90b9d4f4c24015e7927eb14679f57b4 WatchSource:0}: Error finding container 
abff03d113473cb80839f92e5900297db90b9d4f4c24015e7927eb14679f57b4: Status 404 returned error can't find the container with id abff03d113473cb80839f92e5900297db90b9d4f4c24015e7927eb14679f57b4 Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.874552 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f "/env/_master" ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: source "/env/_master" Mar 08 00:07:19 crc kubenswrapper[4713]: set +o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: Mar 08 00:07:19 crc kubenswrapper[4713]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 08 00:07:19 crc kubenswrapper[4713]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --disable-webhook \ Mar 08 00:07:19 crc kubenswrapper[4713]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --loglevel="${LOGLEVEL}" Mar 08 00:07:19 crc kubenswrapper[4713]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.876731 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.878641 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePo
licy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.879712 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.880251 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"401214c15d0ba80cdf8afdf54687a96d22ba11f0fa3c96749c400fe814f51eb0"} Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.881764 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: source /etc/kubernetes/apiserver-url.env Mar 08 00:07:19 crc kubenswrapper[4713]: else Mar 08 00:07:19 crc kubenswrapper[4713]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 00:07:19 crc kubenswrapper[4713]: exit 1 Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 00:07:19 crc kubenswrapper[4713]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.883083 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"abff03d113473cb80839f92e5900297db90b9d4f4c24015e7927eb14679f57b4"} Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.883321 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.884202 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"df2b289f822bc76c592b8649c7aa06c65091beccf8d3647bc795e261789788bd"} Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.885601 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f "/env/_master" ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: source "/env/_master" Mar 08 00:07:19 crc kubenswrapper[4713]: set +o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 08 00:07:19 crc kubenswrapper[4713]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 08 00:07:19 crc kubenswrapper[4713]: ho_enable="--enable-hybrid-overlay" Mar 08 00:07:19 crc kubenswrapper[4713]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 08 00:07:19 crc kubenswrapper[4713]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 08 00:07:19 crc kubenswrapper[4713]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 08 00:07:19 crc kubenswrapper[4713]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-host=127.0.0.1 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --webhook-port=9743 \ Mar 08 00:07:19 crc kubenswrapper[4713]: ${ho_enable} \ Mar 08 00:07:19 crc kubenswrapper[4713]: --enable-interconnect \ Mar 08 00:07:19 crc kubenswrapper[4713]: --disable-approver \ Mar 08 00:07:19 crc kubenswrapper[4713]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --wait-for-kubernetes-api=200s \ Mar 08 00:07:19 crc kubenswrapper[4713]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --loglevel="${LOGLEVEL}" Mar 08 00:07:19 crc kubenswrapper[4713]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.891131 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.891882 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:07:19 crc kubenswrapper[4713]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 00:07:19 crc kubenswrapper[4713]: if [[ -f "/env/_master" ]]; then Mar 08 00:07:19 crc kubenswrapper[4713]: set -o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: source "/env/_master" Mar 08 00:07:19 crc kubenswrapper[4713]: set +o allexport Mar 08 00:07:19 crc kubenswrapper[4713]: fi Mar 08 00:07:19 crc kubenswrapper[4713]: Mar 08 00:07:19 crc kubenswrapper[4713]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 08 00:07:19 crc kubenswrapper[4713]: exec 
/usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 00:07:19 crc kubenswrapper[4713]: --disable-webhook \ Mar 08 00:07:19 crc kubenswrapper[4713]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 08 00:07:19 crc kubenswrapper[4713]: --loglevel="${LOGLEVEL}" Mar 08 00:07:19 crc kubenswrapper[4713]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least 
once, cannot construct envvars Mar 08 00:07:19 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.892381 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice
{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.893260 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 08 00:07:19 crc kubenswrapper[4713]: E0308 00:07:19.893593 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.902114 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.913459 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.923114 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.933066 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942014 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942073 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942107 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:19Z","lastTransitionTime":"2026-03-08T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.942448 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.952232 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.961855 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.972927 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.982995 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:19 crc kubenswrapper[4713]: I0308 00:07:19.997682 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.006805 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.043792 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.043854 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 
00:07:20.043865 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.043879 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.043889 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.138371 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.138561 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:07:21.138543413 +0000 UTC m=+95.258175656 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.147147 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.147202 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.147212 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.147227 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.147241 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.239790 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.239888 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.239931 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.239974 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.239999 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240033 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240046 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240074 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240099 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:21.240080003 +0000 UTC m=+95.359712236 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240096 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240150 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240171 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:21.240138755 +0000 UTC m=+95.359771028 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240177 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240202 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240213 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:21.240192116 +0000 UTC m=+95.359824509 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.240252 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:21.240234447 +0000 UTC m=+95.359866680 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.250396 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.250460 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.250486 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.250516 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.250537 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.354034 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.354120 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.354131 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.354152 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.354166 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.456849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.456916 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.456927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.456948 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.456961 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.540658 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:20 crc kubenswrapper[4713]: E0308 00:07:20.540980 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.545325 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.546140 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.547129 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.547801 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.549391 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.549944 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.550651 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.551761 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.552578 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.553790 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.554644 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.555951 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.556697 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.557495 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.558258 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.559069 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.559941 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560060 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560099 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560110 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560144 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.560547 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.561367 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.562171 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.562804 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.566155 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.566792 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.568356 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.569035 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.570604 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.571541 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.572773 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.573599 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.574841 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.575523 4713 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.575669 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.578106 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.579527 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.580193 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.582612 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.584342 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.585113 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.586513 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.587862 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.589114 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.589907 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.591192 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.592049 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.593135 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.593863 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.594990 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.595965 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.597243 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.597944 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.599032 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.599704 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.600480 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.601533 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.664270 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.664359 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.664381 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.664413 4713 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.664435 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.768644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.768706 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.768729 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.768765 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.768795 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.871952 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.871991 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.872000 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.872014 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.872023 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.975528 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.975595 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.975610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.975634 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:20 crc kubenswrapper[4713]: I0308 00:07:20.975653 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:20Z","lastTransitionTime":"2026-03-08T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.078297 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.078405 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.078426 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.078493 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.078511 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.147204 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.147425 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:23.147379412 +0000 UTC m=+97.267011675 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.181681 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.181762 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.181781 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.181811 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.181871 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.248519 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.248562 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.248579 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.248597 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248711 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248783 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248799 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248810 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248723 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248896 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:23.248855831 +0000 UTC m=+97.368488104 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248890 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248938 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:23.248918492 +0000 UTC m=+97.368550845 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248958 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248974 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:23.248956553 +0000 UTC m=+97.368588896 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.248985 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.249107 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:23.249073906 +0000 UTC m=+97.368706179 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.285333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.285395 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.285406 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.285429 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.285445 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.388227 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.388289 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.388306 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.388329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.388345 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.491076 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.491142 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.491163 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.491188 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.491207 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.539970 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.540005 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.540242 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.540390 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.553866 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.554317 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.554627 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.594307 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.594355 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.594369 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.594387 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.594399 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.697038 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.697090 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.697103 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.697122 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.697135 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.799800 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.799847 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.799856 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.799870 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.799880 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.890131 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:07:21 crc kubenswrapper[4713]: E0308 00:07:21.890372 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.901470 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.901516 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.901525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.901539 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:21 crc kubenswrapper[4713]: I0308 00:07:21.901551 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:21Z","lastTransitionTime":"2026-03-08T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.004861 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.004918 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.004937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.004960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.004978 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.108665 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.108724 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.108741 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.108763 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.108781 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.211149 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.211203 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.211213 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.211229 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.211240 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.314001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.314067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.314090 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.314126 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.314151 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.417146 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.417197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.417214 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.417239 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.417258 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.519789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.519849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.519859 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.519875 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.519886 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.540171 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:22 crc kubenswrapper[4713]: E0308 00:07:22.540293 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.622543 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.622594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.622608 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.622662 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.622689 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.729241 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.729277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.729289 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.729305 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.729316 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.831507 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.831540 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.831547 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.831560 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.831568 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.934465 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.934499 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.934509 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.934522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:22 crc kubenswrapper[4713]: I0308 00:07:22.934549 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:22Z","lastTransitionTime":"2026-03-08T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.036448 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.036486 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.036494 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.036508 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.036516 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.138634 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.138674 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.138686 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.138705 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.138718 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.164141 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.164346 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:27.16431716 +0000 UTC m=+101.283949383 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.240672 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.240716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.240729 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.240746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.240759 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.265455 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.265539 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.265584 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.265630 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265694 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265727 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265739 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265765 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265765 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265805 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:27.265784899 +0000 UTC m=+101.385417132 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265814 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265848 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265864 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265942 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:27.265896382 +0000 UTC m=+101.385528735 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.265992 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:27.265970214 +0000 UTC m=+101.385602487 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.266037 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:27.266022865 +0000 UTC m=+101.385655358 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.342640 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.342702 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.342720 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.342743 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.342762 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.371818 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.371888 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.371989 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.372023 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.372032 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.388176 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.394555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.394601 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.394618 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.394643 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.394662 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.410540 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.415318 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.415386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.415408 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.415434 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.415456 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.431092 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.436130 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.436201 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.436219 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.436244 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.436262 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.450868 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.455214 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.455273 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.455292 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.455317 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.455337 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.471727 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.471979 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.474209 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.474286 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.474338 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.474409 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.474435 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.539983 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.540011 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.540157 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:23 crc kubenswrapper[4713]: E0308 00:07:23.540287 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.576839 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.576878 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.576888 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.576904 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.576916 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.679349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.679409 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.679427 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.679455 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.679472 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.781932 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.781972 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.781984 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.782003 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.782013 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.884713 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.885088 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.885305 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.885481 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.885689 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.988554 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.988583 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.988591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.988603 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:23 crc kubenswrapper[4713]: I0308 00:07:23.988612 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:23Z","lastTransitionTime":"2026-03-08T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.041198 4713 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.090917 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.090974 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.090991 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.091017 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.091040 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.194639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.195008 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.195204 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.195347 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.195477 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.298913 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.299256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.299446 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.299643 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.299819 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.402892 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.402964 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.402982 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.403009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.403031 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.505365 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.505556 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.505623 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.505685 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.505738 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.540841 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:24 crc kubenswrapper[4713]: E0308 00:07:24.541016 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.609019 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.609072 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.609083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.609102 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.609113 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.712633 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.712691 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.712707 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.712730 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.712746 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.815493 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.815886 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.816093 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.816308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.816488 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.919114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.919169 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.919186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.919209 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:24 crc kubenswrapper[4713]: I0308 00:07:24.919228 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:24Z","lastTransitionTime":"2026-03-08T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.022466 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.022743 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.022898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.023118 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.023383 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.127223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.127617 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.127919 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.128223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.128395 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.232392 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.232780 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.233206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.233563 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.233977 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.337749 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.338420 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.338602 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.338766 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.338990 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.442001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.442060 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.442082 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.442110 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.442134 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.539901 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:25 crc kubenswrapper[4713]: E0308 00:07:25.540073 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.539900 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:25 crc kubenswrapper[4713]: E0308 00:07:25.540806 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.545366 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.545433 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.545451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.545477 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.545497 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.648147 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.648210 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.648236 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.648287 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.648310 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.751181 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.751278 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.751297 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.751320 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.751337 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.854068 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.854154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.854178 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.854207 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.854229 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.957261 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.957302 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.957313 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.957330 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:25 crc kubenswrapper[4713]: I0308 00:07:25.957341 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:25Z","lastTransitionTime":"2026-03-08T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.060205 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.060292 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.060311 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.060337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.060355 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.163074 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.163166 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.163186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.163232 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.163266 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.265979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.266039 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.266060 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.266086 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.266103 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.369645 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.369899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.369933 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.369962 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.369987 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.473234 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.473306 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.473323 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.473347 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.473364 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.539986 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:26 crc kubenswrapper[4713]: E0308 00:07:26.540572 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.557520 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.573086 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.576963 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.577022 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.577040 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.577065 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.577085 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.591572 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.609152 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.627312 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.645819 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.661568 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.679229 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.679257 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.679268 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.679284 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.679296 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.782357 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.782416 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.782437 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.782488 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.782506 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.885665 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.885723 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.885747 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.885777 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.885800 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.988921 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.988970 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.988988 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.989012 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:26 crc kubenswrapper[4713]: I0308 00:07:26.989029 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:26Z","lastTransitionTime":"2026-03-08T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.091551 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.091595 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.091607 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.091624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.091635 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.194487 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.194561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.194582 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.194608 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.194626 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.201923 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.202093 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:35.202059706 +0000 UTC m=+109.321691989 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.297385 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.297434 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.297447 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.297464 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.297476 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.302758 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.302844 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.302882 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.302915 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.302921 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.302974 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:35.30295601 +0000 UTC m=+109.422588253 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303025 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303049 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303058 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303091 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:35.303068123 +0000 UTC m=+109.422700386 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303067 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303117 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303176 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:35.303155455 +0000 UTC m=+109.422787718 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303096 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303207 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.303245 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:35.303233307 +0000 UTC m=+109.422865570 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.400331 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.400407 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.400436 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.400469 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.400493 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.504613 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.505001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.505214 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.505437 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.505638 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.540046 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.540237 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.541062 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:27 crc kubenswrapper[4713]: E0308 00:07:27.541256 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.608614 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.608998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.609462 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.609684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.609938 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.713537 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.713596 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.713619 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.713649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.713672 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.817449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.818294 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.818481 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.818636 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.818774 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.921508 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.921552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.921561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.921576 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:27 crc kubenswrapper[4713]: I0308 00:07:27.921587 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:27Z","lastTransitionTime":"2026-03-08T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.023815 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.023873 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.023883 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.023898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.023910 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.127169 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.127238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.127255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.127281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.127300 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.229000 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.229037 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.229046 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.229059 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.229068 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.332083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.332143 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.332157 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.332175 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.332188 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.434969 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.435067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.435086 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.435150 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.435172 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.537934 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.537987 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.538009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.538027 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.538041 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.540554 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:28 crc kubenswrapper[4713]: E0308 00:07:28.540761 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.640459 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.640500 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.640509 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.640525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.640535 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.743362 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.743490 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.743519 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.743561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.743586 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.846755 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.846883 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.846920 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.846950 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.846967 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.949708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.949776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.949794 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.949817 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:28 crc kubenswrapper[4713]: I0308 00:07:28.949869 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:28Z","lastTransitionTime":"2026-03-08T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.053143 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.053220 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.053256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.053287 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.053307 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.156566 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.156666 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.156684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.156710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.156730 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.259296 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.259356 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.259373 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.259396 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.259413 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.361910 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.361971 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.361993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.362021 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.362039 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.465526 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.465610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.465637 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.465671 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.465697 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.540107 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:29 crc kubenswrapper[4713]: E0308 00:07:29.540287 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.540775 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:29 crc kubenswrapper[4713]: E0308 00:07:29.540955 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.568644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.568694 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.568711 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.568735 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.568752 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.671208 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.671498 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.671650 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.671764 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.671873 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.775136 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.775206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.775228 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.775257 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.775279 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.878528 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.878588 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.878605 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.878631 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.878650 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.981325 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.981663 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.981792 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.982010 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:29 crc kubenswrapper[4713]: I0308 00:07:29.982155 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:29Z","lastTransitionTime":"2026-03-08T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.085710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.085780 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.085875 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.085912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.085931 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.189480 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.189562 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.189586 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.189627 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.189650 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.292386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.292448 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.292465 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.292488 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.292507 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.395381 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.395457 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.395475 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.395506 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.395526 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.501006 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.501158 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.501180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.501208 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.501260 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.541348 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:30 crc kubenswrapper[4713]: E0308 00:07:30.541775 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.604871 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.604977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.605017 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.605059 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.605087 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.708486 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.708774 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.708942 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.709066 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.709171 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.812461 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.812555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.812609 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.812634 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.812687 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.914661 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.915540 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.915708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.915880 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:30 crc kubenswrapper[4713]: I0308 00:07:30.916067 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:30Z","lastTransitionTime":"2026-03-08T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.019724 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.020114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.020272 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.020427 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.020548 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.123742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.123793 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.123810 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.123858 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.123880 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.226131 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.226182 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.226193 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.226211 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.226222 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.329256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.329337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.329361 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.329396 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.329422 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.432413 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.432539 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.432565 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.432596 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.432623 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.539976 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.540022 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:31 crc kubenswrapper[4713]: E0308 00:07:31.540235 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:31 crc kubenswrapper[4713]: E0308 00:07:31.540574 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.542141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.542203 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.542226 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.542256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.542278 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.645738 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.645919 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.645950 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.645990 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.646018 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.749026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.749123 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.749141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.749165 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.749181 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.853111 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.853201 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.853225 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.853256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.853284 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.957397 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.957466 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.957487 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.957511 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:31 crc kubenswrapper[4713]: I0308 00:07:31.957528 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:31Z","lastTransitionTime":"2026-03-08T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.061240 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.061624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.061793 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.062040 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.062267 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.165122 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.165172 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.165185 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.165202 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.165213 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.269083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.269180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.269203 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.269232 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.269250 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.372229 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.372396 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.372423 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.372452 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.372474 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.474732 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.474854 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.474885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.474915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.474932 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.540515 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:32 crc kubenswrapper[4713]: E0308 00:07:32.540710 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.578176 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.578246 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.578269 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.578298 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.578320 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.681908 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.681969 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.681989 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.682015 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.682036 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.784670 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.784727 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.784745 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.784768 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.784785 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.886979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.887024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.887036 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.887057 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.887070 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.925334 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.925388 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a"} Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.941148 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.955607 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.970898 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.987928 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.989869 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.989903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.989915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.989931 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:32 crc kubenswrapper[4713]: I0308 00:07:32.989943 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:32Z","lastTransitionTime":"2026-03-08T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.002257 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.011669 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.021938 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.091989 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.092040 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.092051 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 
00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.092069 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.092082 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.194349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.194685 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.194698 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.194714 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.194725 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.296504 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.296550 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.296561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.296593 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.296605 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.398995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.399036 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.399046 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.399061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.399071 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.502024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.502067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.502078 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.502095 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.502108 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.540603 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.540751 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.541249 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.541363 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.541793 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.541976 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.604789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.604885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.604903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc 
kubenswrapper[4713]: I0308 00:07:33.604928 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.604945 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.708326 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.708379 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.708398 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.708420 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.708436 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.771542 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fp2h2"] Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.771912 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.774928 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.774965 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.775067 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.796140 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.811695 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.811729 4713 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.811738 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.811752 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.811762 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.812203 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.830900 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.845455 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.865587 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.866896 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34185fa0-b348-45e6-990e-4bb01410d564-hosts-file\") pod \"node-resolver-fp2h2\" (UID: \"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.866968 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk47b\" (UniqueName: \"kubernetes.io/projected/34185fa0-b348-45e6-990e-4bb01410d564-kube-api-access-lk47b\") pod \"node-resolver-fp2h2\" (UID: 
\"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.871018 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.871095 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.871121 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.871198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.871227 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.880922 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.888162 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.893091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.893127 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.893138 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.893154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.893166 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.897444 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.911972 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.912646 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.915674 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.915706 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.915718 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.915736 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.915751 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.929873 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531"} Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.931917 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.936332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.936563 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.936694 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.936858 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.936989 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.947893 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.953226 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.956979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.957019 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.957033 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.957049 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.957061 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.966656 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.968333 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34185fa0-b348-45e6-990e-4bb01410d564-hosts-file\") pod \"node-resolver-fp2h2\" (UID: \"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.968397 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk47b\" (UniqueName: \"kubernetes.io/projected/34185fa0-b348-45e6-990e-4bb01410d564-kube-api-access-lk47b\") pod \"node-resolver-fp2h2\" (UID: \"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.968657 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34185fa0-b348-45e6-990e-4bb01410d564-hosts-file\") pod \"node-resolver-fp2h2\" (UID: 
\"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.973173 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: E0308 00:07:33.973299 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.974614 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.974644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.974653 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.974667 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.974677 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:33Z","lastTransitionTime":"2026-03-08T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.983630 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:33 crc kubenswrapper[4713]: I0308 00:07:33.988717 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk47b\" (UniqueName: \"kubernetes.io/projected/34185fa0-b348-45e6-990e-4bb01410d564-kube-api-access-lk47b\") pod \"node-resolver-fp2h2\" (UID: \"34185fa0-b348-45e6-990e-4bb01410d564\") " pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.001200 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.017428 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.033296 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.045705 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.058984 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.076687 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.076753 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.076770 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.076794 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.076816 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.093114 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fp2h2" Mar 08 00:07:34 crc kubenswrapper[4713]: W0308 00:07:34.109773 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34185fa0_b348_45e6_990e_4bb01410d564.slice/crio-97f7ff49b6fee4f7a5ed851a9363423614f03c188a5f1171e72af244bf688d49 WatchSource:0}: Error finding container 97f7ff49b6fee4f7a5ed851a9363423614f03c188a5f1171e72af244bf688d49: Status 404 returned error can't find the container with id 97f7ff49b6fee4f7a5ed851a9363423614f03c188a5f1171e72af244bf688d49 Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.153378 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4kr8v"] Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.156580 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-fh96f"] Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.157632 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-54zzt"] Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.157869 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.157768 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.160351 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.160439 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.160869 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.162342 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.162388 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.162698 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.162890 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.163184 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.163293 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.163561 4713 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.164304 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.174734 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.176753 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.180097 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.180139 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.180156 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.180174 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.180188 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.185765 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.198003 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.210986 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.258208 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271363 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlmxl\" (UniqueName: \"kubernetes.io/projected/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-kube-api-access-zlmxl\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271450 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-conf-dir\") 
pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271478 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-system-cni-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271499 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cnibin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271536 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-k8s-cni-cncf-io\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271556 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-system-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271569 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-socket-dir-parent\") pod 
\"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271582 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-etc-kubernetes\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271595 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-os-release\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271608 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271623 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-bin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271659 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-tuning-conf-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271679 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-kubelet\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271701 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-hostroot\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271716 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-multus\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271737 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-binary-copy\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271751 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bv9p9\" (UniqueName: \"kubernetes.io/projected/bf95e3f7-808b-434f-8fd4-c7e7365a1561-kube-api-access-bv9p9\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271781 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-multus-certs\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271796 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-mcd-auth-proxy-config\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271838 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-os-release\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271857 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-928t2\" (UniqueName: \"kubernetes.io/projected/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-kube-api-access-928t2\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271876 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-proxy-tls\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271894 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271914 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cni-binary-copy\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271932 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-daemon-config\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271950 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-rootfs\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271967 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cnibin\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.271986 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-netns\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.272292 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.285028 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.285055 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.285063 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 
00:07:34.285075 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.285084 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.297375 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.312618 4713 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.326930 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.341474 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.354868 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.367135 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372588 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv9p9\" (UniqueName: \"kubernetes.io/projected/bf95e3f7-808b-434f-8fd4-c7e7365a1561-kube-api-access-bv9p9\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372605 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-multus-certs\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372624 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-mcd-auth-proxy-config\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372640 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-os-release\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372658 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-928t2\" (UniqueName: \"kubernetes.io/projected/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-kube-api-access-928t2\") pod 
\"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372672 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cni-binary-copy\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372690 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-daemon-config\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372705 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-proxy-tls\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372722 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372737 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-rootfs\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " 
pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372756 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cnibin\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372777 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-netns\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372808 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlmxl\" (UniqueName: \"kubernetes.io/projected/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-kube-api-access-zlmxl\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372812 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-os-release\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372839 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-conf-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc 
kubenswrapper[4713]: I0308 00:07:34.372886 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cnibin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372861 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-conf-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372909 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-system-cni-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372936 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-system-cni-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372951 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-k8s-cni-cncf-io\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372961 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cnibin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372972 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cnibin\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372980 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-system-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372998 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-socket-dir-parent\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373006 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-rootfs\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373014 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-etc-kubernetes\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373033 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-bin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-system-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373053 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-os-release\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373068 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373084 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-kubelet\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373771 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-hostroot\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373805 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-tuning-conf-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373835 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-multus\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373130 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-cni-dir\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373149 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-socket-dir-parent\") pod \"multus-fh96f\" (UID: 
\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373168 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-etc-kubernetes\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373241 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-kubelet\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373070 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-k8s-cni-cncf-io\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373317 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-os-release\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373402 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-mcd-auth-proxy-config\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 
00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373402 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-binary-copy\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373740 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-cni-binary-copy\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373741 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373090 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-netns\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373963 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-hostroot\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.372754 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-run-multus-certs\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.374327 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-multus\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.373122 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf95e3f7-808b-434f-8fd4-c7e7365a1561-host-var-lib-cni-bin\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.375179 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf95e3f7-808b-434f-8fd4-c7e7365a1561-multus-daemon-config\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.377498 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-tuning-conf-dir\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.378209 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-proxy-tls\") pod 
\"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.380188 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.387296 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.387328 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.387336 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.387352 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.387364 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.388107 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-928t2\" (UniqueName: \"kubernetes.io/projected/d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205-kube-api-access-928t2\") pod \"multus-additional-cni-plugins-54zzt\" (UID: \"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\") " pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.393947 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.396405 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlmxl\" (UniqueName: \"kubernetes.io/projected/5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76-kube-api-access-zlmxl\") pod \"machine-config-daemon-4kr8v\" (UID: \"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\") " pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.397150 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv9p9\" (UniqueName: \"kubernetes.io/projected/bf95e3f7-808b-434f-8fd4-c7e7365a1561-kube-api-access-bv9p9\") pod \"multus-fh96f\" (UID: \"bf95e3f7-808b-434f-8fd4-c7e7365a1561\") " pod="openshift-multus/multus-fh96f" Mar 08 
00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.406610 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.418064 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.429100 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.440692 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.450182 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.461410 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.476252 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.487033 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490037 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490058 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490066 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490078 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490086 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.490435 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fh96f" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.499581 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.507387 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-54zzt" Mar 08 00:07:34 crc kubenswrapper[4713]: W0308 00:07:34.518985 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf95e3f7_808b_434f_8fd4_c7e7365a1561.slice/crio-d52720c7b61f103d964e37454c76bb3c47479686b9097705fdcc71ba15fa3542 WatchSource:0}: Error finding container d52720c7b61f103d964e37454c76bb3c47479686b9097705fdcc71ba15fa3542: Status 404 returned error can't find the container with id d52720c7b61f103d964e37454c76bb3c47479686b9097705fdcc71ba15fa3542 Mar 08 00:07:34 crc kubenswrapper[4713]: W0308 00:07:34.532091 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7dbbe8c_4ae1_4a6b_9b62_eac6a5c73205.slice/crio-93cf361bb8ca9fd708c5a2d407009e480d619b1eb23e60fab80652ad44ce55a1 WatchSource:0}: Error finding container 93cf361bb8ca9fd708c5a2d407009e480d619b1eb23e60fab80652ad44ce55a1: Status 404 returned error can't find the container with id 93cf361bb8ca9fd708c5a2d407009e480d619b1eb23e60fab80652ad44ce55a1 Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.541169 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:34 crc kubenswrapper[4713]: E0308 00:07:34.541298 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.559551 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gsfft"] Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.560332 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.564282 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.564600 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.564929 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.565192 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.565535 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.565578 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.565611 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.576570 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.589096 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.592451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.592493 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.592504 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc 
kubenswrapper[4713]: I0308 00:07:34.592522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.592536 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.600960 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.611547 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.628089 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.642224 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.655130 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.668900 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676084 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676113 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 
00:07:34.676130 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676149 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676163 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676186 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676201 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc 
kubenswrapper[4713]: I0308 00:07:34.676216 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676284 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676315 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676402 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676486 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 
00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676542 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676577 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676615 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl27z\" (UniqueName: \"kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676651 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676710 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676754 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676809 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.676868 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.683777 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.694533 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc 
kubenswrapper[4713]: I0308 00:07:34.694572 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.694581 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.694594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.694607 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.696657 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.714205 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.743608 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd 
nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},
{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777416 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777488 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777534 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777553 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd\") 
pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777603 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777572 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777658 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777689 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777704 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides\") pod \"ovnkube-node-gsfft\" (UID: 
\"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777723 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777716 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777725 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777761 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777793 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777804 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777811 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777875 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777877 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777897 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 
00:07:34.777902 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777922 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777942 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777952 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777966 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777974 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777993 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.777996 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778019 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778022 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778023 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log\") pod 
\"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778044 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl27z\" (UniqueName: \"kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778114 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778139 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778333 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778702 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778193 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.778997 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.779039 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.783652 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.796624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.796675 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.796686 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.796701 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.796711 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.800108 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl27z\" (UniqueName: \"kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z\") pod \"ovnkube-node-gsfft\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.899060 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.899102 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.899114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.899128 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.899138 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:34Z","lastTransitionTime":"2026-03-08T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.933714 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerStarted","Data":"c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.933764 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerStarted","Data":"93cf361bb8ca9fd708c5a2d407009e480d619b1eb23e60fab80652ad44ce55a1"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.935776 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fp2h2" event={"ID":"34185fa0-b348-45e6-990e-4bb01410d564","Type":"ContainerStarted","Data":"edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.935861 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fp2h2" event={"ID":"34185fa0-b348-45e6-990e-4bb01410d564","Type":"ContainerStarted","Data":"97f7ff49b6fee4f7a5ed851a9363423614f03c188a5f1171e72af244bf688d49"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.937623 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" 
event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.937669 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.937682 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"c3ada8a6a2b79759353dfd8087cd376ccb54b5781a552e2c181132bd8987a990"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.939059 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerStarted","Data":"f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.939087 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerStarted","Data":"d52720c7b61f103d964e37454c76bb3c47479686b9097705fdcc71ba15fa3542"} Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.952957 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.965319 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.965519 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: I0308 00:07:34.977965 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:34 crc kubenswrapper[4713]: W0308 00:07:34.979705 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fbba07_87e8_4e77_b834_ed68af718d11.slice/crio-6355753be9662030b1350e38ca6fc0620acd7ba140b99c59577d4d942dd0976d WatchSource:0}: Error finding container 6355753be9662030b1350e38ca6fc0620acd7ba140b99c59577d4d942dd0976d: Status 404 returned error can't find the container with id 6355753be9662030b1350e38ca6fc0620acd7ba140b99c59577d4d942dd0976d Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 
00:07:35.001733 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.001775 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.001786 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.001802 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.001813 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.005392 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.029778 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.043409 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.055147 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.066900 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.079281 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.094869 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.103899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.103937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.103951 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.103967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.103977 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.118636 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.134050 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.153662 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.174664 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.191434 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.206595 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc 
kubenswrapper[4713]: I0308 00:07:35.206656 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.206669 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.206697 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.206710 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.207508 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.221337 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.234921 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.248688 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.263607 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.279978 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.282525 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.282739 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:51.282691819 +0000 UTC m=+125.402324052 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.292648 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.305687 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.309457 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.309509 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.309527 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.309554 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.309573 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.318648 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.383569 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.383692 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.383753 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383777 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383816 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383850 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.383815 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383919 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:07:51.383900541 +0000 UTC m=+125.503532774 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383923 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383969 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:51.383959293 +0000 UTC m=+125.503591526 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.383924 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.384078 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.384076 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.384128 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.384193 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:51.384154777 +0000 UTC m=+125.503787160 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.384228 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:51.384211109 +0000 UTC m=+125.503843582 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.412462 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.412537 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.412555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.412581 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.412599 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.515728 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.515815 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.515868 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.515905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.515935 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.540317 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.540333 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.540633 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:35 crc kubenswrapper[4713]: E0308 00:07:35.540779 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.620254 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.620294 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.620303 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.620323 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.620333 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.722888 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.722954 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.722968 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.722990 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.723004 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.825508 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.825549 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.825560 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.825578 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.825587 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.929002 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.929294 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.929305 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.929324 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.929338 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:35Z","lastTransitionTime":"2026-03-08T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.942139 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d" exitCode=0 Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.942204 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.942229 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"6355753be9662030b1350e38ca6fc0620acd7ba140b99c59577d4d942dd0976d"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.943679 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425" exitCode=0 Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.944156 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425"} Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.962263 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.987103 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:35 crc kubenswrapper[4713]: I0308 00:07:35.999258 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.009626 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.022445 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.048974 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.060285 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.060340 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.060353 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.060370 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.060385 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.086850 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.103395 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.115202 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.123340 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.133272 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.148454 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.158979 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.162101 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.162134 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.162141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.162154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.162162 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.171085 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.184724 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.199068 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.213967 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.226082 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.242973 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.255251 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.264594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.264619 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.264644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.264658 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.264668 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.266867 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.276860 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.290839 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.308044 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.367027 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.367078 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.367089 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.367103 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.367112 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.469006 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.469032 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.469040 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.469052 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.469061 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.540731 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:36 crc kubenswrapper[4713]: E0308 00:07:36.541046 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.562785 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.572776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.572813 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.572855 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.572874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.572885 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.583385 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.603587 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.615555 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.635762 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.650934 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.664229 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.674713 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.674747 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.674758 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.674776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.674789 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.677955 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.692870 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.704492 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.716444 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.727431 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.776891 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.777108 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.777116 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc 
kubenswrapper[4713]: I0308 00:07:36.777130 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.777138 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.880281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.880332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.880341 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.880355 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.880364 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.950938 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.951295 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.951336 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.951625 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.951646 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.951973 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0"} Mar 08 00:07:36 crc kubenswrapper[4713]: 
I0308 00:07:36.952901 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79" exitCode=0 Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.952949 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.974860 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.982038 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.982073 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.982087 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.982107 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.982122 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:36Z","lastTransitionTime":"2026-03-08T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.989299 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:36 crc kubenswrapper[4713]: I0308 00:07:36.998791 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.012027 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.027702 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.041175 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.053128 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.066854 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.078375 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.084672 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.084710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.084723 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc 
kubenswrapper[4713]: I0308 00:07:37.084742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.084753 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.091773 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.137198 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.164189 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.187205 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc 
kubenswrapper[4713]: I0308 00:07:37.187256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.187328 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.187349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.187364 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.295323 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.295373 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.295386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.295402 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.295414 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.399746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.399847 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.399872 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.399899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.399921 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.504994 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.505047 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.505062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.505083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.505099 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.540645 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.540728 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:37 crc kubenswrapper[4713]: E0308 00:07:37.540819 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:37 crc kubenswrapper[4713]: E0308 00:07:37.541025 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.608271 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.608348 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.608374 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.608405 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.608428 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.710804 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.710909 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.710931 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.710960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.710982 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.812697 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.812751 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.812764 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.812780 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.812794 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.915243 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.915279 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.915289 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.915304 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.915314 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:37Z","lastTransitionTime":"2026-03-08T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.957244 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2" exitCode=0 Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.957289 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2"} Mar 08 00:07:37 crc kubenswrapper[4713]: I0308 00:07:37.979928 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:37Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.004896 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.017160 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.017624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.017636 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.017649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.017660 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.024366 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:
07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.036435 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.050014 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.060039 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.072906 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 
00:07:38.085147 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 
00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"
containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.096014 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.106671 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.120561 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.122093 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 
00:07:38.122126 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.122149 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.122165 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.122176 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.132746 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.228194 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.228228 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.228238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc 
kubenswrapper[4713]: I0308 00:07:38.228254 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.228265 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.330004 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.330048 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.330062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.330078 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.330088 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.433074 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.433120 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.433135 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.433157 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.433171 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.536664 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.536721 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.536762 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.536786 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.536804 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.540121 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:38 crc kubenswrapper[4713]: E0308 00:07:38.540284 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.640401 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.640451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.640463 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.640482 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.640494 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.743234 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.743455 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.743525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.743588 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.743645 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.846074 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.846135 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.846148 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.846167 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.846181 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.948923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.948987 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.948998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.949017 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.949031 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:38Z","lastTransitionTime":"2026-03-08T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.964295 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625" exitCode=0 Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.964398 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.966018 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.972935 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078"} Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.976518 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:38 crc kubenswrapper[4713]: I0308 00:07:38.994453 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:38Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.010317 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.022578 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.038559 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.057343 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.058618 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.058642 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.058649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.058662 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.058691 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.069735 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.081468 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.095394 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.108899 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.125491 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.137704 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.148612 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.156971 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.161123 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.161158 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.161171 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.161188 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.161200 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.171894 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.182676 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.194373 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.204647 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.215144 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.225879 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.239292 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.254519 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.262951 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.262987 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.262997 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.263015 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.263025 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.267505 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.278099 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00
:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.365745 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.365810 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.365900 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.365927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.365947 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.468594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.468631 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.468639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.468652 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.468664 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.540717 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.540723 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:39 crc kubenswrapper[4713]: E0308 00:07:39.540876 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:39 crc kubenswrapper[4713]: E0308 00:07:39.540955 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.554182 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.570710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.570747 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.570758 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.570772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.570786 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.673141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.673189 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.673199 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.673213 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.673223 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.774722 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.774759 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.774768 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.774783 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.774793 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.877134 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.877277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.877291 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.877308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.877321 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.977577 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee" exitCode=0 Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.977659 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee"} Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.978714 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.978757 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.978769 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.978783 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:39 crc kubenswrapper[4713]: I0308 00:07:39.978795 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:39Z","lastTransitionTime":"2026-03-08T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.000481 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.016770 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.030422 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.042369 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61d
eb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.052971 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.066613 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.082986 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.083030 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.083044 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc 
kubenswrapper[4713]: I0308 00:07:40.083061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.083073 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.087298 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.100219 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.116777 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.130042 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.142697 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.160959 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.176096 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.186076 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.186103 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.186112 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.186126 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.186135 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.289151 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.289185 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.289195 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.289212 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.289222 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.391195 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.391225 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.391235 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.391249 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.391260 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.493852 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.493889 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.493898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.493912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.493922 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.540988 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:40 crc kubenswrapper[4713]: E0308 00:07:40.541127 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.596612 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.596875 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.596968 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.597076 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.597152 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.699552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.699595 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.699610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.699630 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.699644 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.802009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.802105 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.802129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.802162 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.802183 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.827976 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-d9bpk"] Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.828549 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.837456 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.837521 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.837645 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.838165 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.845092 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23406c9e-4ba0-4b59-a360-fb325a1adb0b-host\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.845144 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r7qj\" (UniqueName: \"kubernetes.io/projected/23406c9e-4ba0-4b59-a360-fb325a1adb0b-kube-api-access-5r7qj\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.845192 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/23406c9e-4ba0-4b59-a360-fb325a1adb0b-serviceca\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc 
kubenswrapper[4713]: I0308 00:07:40.853409 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.869666 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.882318 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.891605 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904029 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904416 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904440 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904450 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904463 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.904473 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:40Z","lastTransitionTime":"2026-03-08T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.915480 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.926067 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.937199 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.945897 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r7qj\" (UniqueName: \"kubernetes.io/projected/23406c9e-4ba0-4b59-a360-fb325a1adb0b-kube-api-access-5r7qj\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.945949 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/23406c9e-4ba0-4b59-a360-fb325a1adb0b-serviceca\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.946019 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23406c9e-4ba0-4b59-a360-fb325a1adb0b-host\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " 
pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.946191 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23406c9e-4ba0-4b59-a360-fb325a1adb0b-host\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.946995 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/23406c9e-4ba0-4b59-a360-fb325a1adb0b-serviceca\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.949498 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574
53265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.962575 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.969456 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r7qj\" (UniqueName: \"kubernetes.io/projected/23406c9e-4ba0-4b59-a360-fb325a1adb0b-kube-api-access-5r7qj\") pod \"node-ca-d9bpk\" (UID: \"23406c9e-4ba0-4b59-a360-fb325a1adb0b\") " pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.976309 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.984384 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205" containerID="3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208" exitCode=0 Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.984456 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" 
event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerDied","Data":"3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208"} Mar 08 00:07:40 crc kubenswrapper[4713]: I0308 00:07:40.989344 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.008222 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.008297 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.008324 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 
00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.008352 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.008373 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.009514 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.022993 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.037383 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.049188 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.059503 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.074502 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cce
eeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":
\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.087932 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.115969 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.116258 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.116312 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.116329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 
00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.116352 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.116368 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.144811 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.148274 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d9bpk" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.164075 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.175752 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.188551 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.200733 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.213166 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.219150 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.219180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.219191 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.219206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.219219 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.231894 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.243405 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.320682 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.320720 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.320732 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.320748 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.320760 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.422598 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.422639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.422647 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.422662 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.422672 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.524276 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.524333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.524389 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.524414 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.524430 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.540686 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.540735 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:41 crc kubenswrapper[4713]: E0308 00:07:41.540808 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:41 crc kubenswrapper[4713]: E0308 00:07:41.540999 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.626852 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.626892 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.626903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.626920 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.626932 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.728720 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.728766 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.728778 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.728794 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.728805 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.831916 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.831993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.832026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.832062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.832083 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.934049 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.934084 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.934095 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.934110 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.934122 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:41Z","lastTransitionTime":"2026-03-08T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.991846 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" event={"ID":"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205","Type":"ContainerStarted","Data":"03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.994435 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d9bpk" event={"ID":"23406c9e-4ba0-4b59-a360-fb325a1adb0b","Type":"ContainerStarted","Data":"0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352"} Mar 08 00:07:41 crc kubenswrapper[4713]: I0308 00:07:41.994507 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d9bpk" event={"ID":"23406c9e-4ba0-4b59-a360-fb325a1adb0b","Type":"ContainerStarted","Data":"d50b9cebf0a75336d3c988668e019e91bfc640c20e72aed0928a601696b242cd"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.000306 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.000732 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.000787 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.000805 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.012671 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.029107 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.030255 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037247 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037301 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037314 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037331 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037355 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.037372 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.042531 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.055758 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.069345 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.083271 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.093269 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.123997 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.136414 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.139449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.139470 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.139479 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.139492 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.139501 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.151375 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.164879 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.182227 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.204259 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.219102 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.235446 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.242029 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.242081 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.242094 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.242111 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.242124 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.260430 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.272329 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.286811 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9f
aa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c395
8568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:4
0Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.300371 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.319186 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.331991 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.345409 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.345484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.345504 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.345532 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.345550 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.352323 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.368578 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.386090 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.399240 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.422650 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.445169 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.448034 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.448080 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.448097 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.448120 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: 
I0308 00:07:42.448131 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.459123 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.540675 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:42 crc kubenswrapper[4713]: E0308 00:07:42.540842 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.549977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.550038 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.550051 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.550067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.550078 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.652652 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.652708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.652719 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.652734 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.652744 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.755169 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.755234 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.755251 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.755273 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.755289 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.857591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.857655 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.857669 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.857688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.857701 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.960319 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.960359 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.960367 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.960380 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:42 crc kubenswrapper[4713]: I0308 00:07:42.960388 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:42Z","lastTransitionTime":"2026-03-08T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.064418 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.064477 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.064494 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.064523 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.064541 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.167654 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.167720 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.167764 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.167790 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.167809 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.270377 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.270456 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.270482 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.270542 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.270589 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.372196 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.372233 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.372242 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.372254 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.372263 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.474075 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.474121 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.474130 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.474143 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.474154 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.540536 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.540621 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:43 crc kubenswrapper[4713]: E0308 00:07:43.540647 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:43 crc kubenswrapper[4713]: E0308 00:07:43.540729 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.576468 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.576508 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.576521 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.576537 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.576548 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.678626 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.678659 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.678668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.678681 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.678690 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.781062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.781101 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.781110 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.781125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.781136 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.884030 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.884094 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.884112 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.884136 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.884153 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.985292 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.985343 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.985363 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.985386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:43 crc kubenswrapper[4713]: I0308 00:07:43.985402 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:43Z","lastTransitionTime":"2026-03-08T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.005987 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.008815 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/0.log" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.011155 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.011222 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.011247 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.011281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.011306 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.014408 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254" exitCode=1 Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.014486 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.015699 4713 scope.go:117] "RemoveContainer" containerID="c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.029265 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.033992 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.034044 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.034061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.034083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.034101 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.045501 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.057313 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.061695 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.061740 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.061753 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.061771 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.061783 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.065649 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 
00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.075431 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081089 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081221 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081240 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081252 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.081232 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z 
is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.096534 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.098810 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.099721 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.102125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.102173 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.102185 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.102203 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.102219 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.109680 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.122386 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.145292 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.159377 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: 
I0308 00:07:44.172209 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.195497 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.205058 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.205096 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.205108 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.205125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.205137 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.211031 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.224691 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.239027 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.251212 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.307515 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 
00:07:44.307552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.307561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.307575 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.307585 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.410228 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.410267 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.410277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.410291 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.410300 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.512368 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.512404 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.512414 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.512428 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.512437 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.541706 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:44 crc kubenswrapper[4713]: E0308 00:07:44.541886 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.545601 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.614667 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.614707 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.614719 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.614737 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.614749 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.716927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.716961 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.716973 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.716990 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.717002 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.819590 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.819653 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.819662 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.819677 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.819686 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.922187 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.922239 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.922253 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.922295 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:44 crc kubenswrapper[4713]: I0308 00:07:44.922309 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:44Z","lastTransitionTime":"2026-03-08T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.019913 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/0.log" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.023398 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.023796 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.024291 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.024313 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.024322 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.024332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.024341 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.025969 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.027864 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.028093 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.039931 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.058451 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.073218 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.086925 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.096263 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.105140 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.120681 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.125761 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.125789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.125797 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.125810 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.125819 4713 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.129304 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.148479 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.162959 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.177653 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.187702 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.197816 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.210200 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.223806 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.228044 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.228240 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.228255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.228277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.228293 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.239883 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.254509 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.265419 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.276577 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.286553 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.306487 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.319602 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.338054 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.338085 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.338094 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.338107 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.338115 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.344503 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.358305 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.372318 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.388555 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.400110 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.418400 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.440810 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.440862 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.440870 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.440885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.440894 4713 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.540278 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:45 crc kubenswrapper[4713]: E0308 00:07:45.540594 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.540281 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:45 crc kubenswrapper[4713]: E0308 00:07:45.540692 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.542798 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.542830 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.542839 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.542857 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.542866 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.644909 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.644966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.644982 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.645010 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.645029 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.747673 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.747714 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.747723 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.747753 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.747767 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.850708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.850733 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.850741 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.850754 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.850762 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.953466 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.953531 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.953552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.953579 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:45 crc kubenswrapper[4713]: I0308 00:07:45.953603 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:45Z","lastTransitionTime":"2026-03-08T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.032601 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/1.log" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.033230 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/0.log" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.035962 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c" exitCode=1 Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.035999 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.036060 4713 scope.go:117] "RemoveContainer" containerID="c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.036789 4713 scope.go:117] "RemoveContainer" containerID="6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c" Mar 08 00:07:46 crc kubenswrapper[4713]: E0308 00:07:46.037009 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.056988 4713 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.057026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.057038 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.057055 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.057067 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:46Z","lastTransitionTime":"2026-03-08T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.057262 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.066760 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.081995 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.095317 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.108675 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.126977 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.140125 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.150762 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.158995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.159113 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.159177 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:46 crc 
kubenswrapper[4713]: I0308 00:07:46.159255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.159323 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:46Z","lastTransitionTime":"2026-03-08T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.160117 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.180647 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.194667 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.214522 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true 
include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\
\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.226138 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.239091 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.261735 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.261959 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.262075 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.262207 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.262349 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:46Z","lastTransitionTime":"2026-03-08T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.365088 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.365333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.365427 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.365553 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.365632 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:46Z","lastTransitionTime":"2026-03-08T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.467330 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.467578 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.467649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.467718 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.467782 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:46Z","lastTransitionTime":"2026-03-08T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.540072 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:46 crc kubenswrapper[4713]: E0308 00:07:46.540317 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:46 crc kubenswrapper[4713]: E0308 00:07:46.568305 4713 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.576390 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.596088 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.611078 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.634653 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: E0308 00:07:46.636565 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.678079 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.693096 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.704452 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.716513 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.726786 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.740359 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.754456 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r"] Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.755250 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.757026 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.757338 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.763927 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.777564 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.794875 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending 
*v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator 
aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{
\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"
},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.800482 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.800632 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.800741 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vn4\" (UniqueName: \"kubernetes.io/projected/2f22c2d7-0e3d-4132-b548-87e98062c766-kube-api-access-x9vn4\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.800930 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f22c2d7-0e3d-4132-b548-87e98062c766-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 
00:07:46.808340 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 
00:07:46.818486 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.831282 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.843799 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.856536 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.866999 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.876492 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.894031 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.901572 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.901621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vn4\" (UniqueName: 
\"kubernetes.io/projected/2f22c2d7-0e3d-4132-b548-87e98062c766-kube-api-access-x9vn4\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.901697 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f22c2d7-0e3d-4132-b548-87e98062c766-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.901729 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.902489 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.902679 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f22c2d7-0e3d-4132-b548-87e98062c766-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc 
kubenswrapper[4713]: I0308 00:07:46.910044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f22c2d7-0e3d-4132-b548-87e98062c766-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.914455 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.917721 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9vn4\" (UniqueName: \"kubernetes.io/projected/2f22c2d7-0e3d-4132-b548-87e98062c766-kube-api-access-x9vn4\") pod \"ovnkube-control-plane-749d76644c-r2j6r\" (UID: \"2f22c2d7-0e3d-4132-b548-87e98062c766\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.927216 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.944856 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.961439 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.973413 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:46 crc kubenswrapper[4713]: I0308 00:07:46.985714 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.003118 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6d9f665f4f27521614ebef412c48d8a6f29342a3069580cd12dda0a3ba9d254\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:43Z\\\",\\\"message\\\":\\\"ler 8 for removal\\\\nI0308 00:07:43.841366 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 00:07:43.841403 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 00:07:43.841421 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 00:07:43.841468 6520 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 00:07:43.841519 6520 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 00:07:43.841534 6520 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 00:07:43.841601 6520 factory.go:656] Stopping watch factory\\\\nI0308 00:07:43.841636 6520 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 00:07:43.841666 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 00:07:43.841682 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 00:07:43.841702 6520 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 00:07:43.841706 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 00:07:43.841402 6520 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 00:07:43.841690 6520 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 00:07:43.841715 6520 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true 
include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\
\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:46Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.021588 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.042252 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/1.log" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.047140 4713 scope.go:117] "RemoveContainer" containerID="6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c" Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.047405 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.067141 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf124467
4af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.068579 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" Mar 08 00:07:47 crc kubenswrapper[4713]: W0308 00:07:47.089888 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f22c2d7_0e3d_4132_b548_87e98062c766.slice/crio-654a257cd2697566e8ce6feadc8783519d6605552a2ca92a65bdf57e3a1b080c WatchSource:0}: Error finding container 654a257cd2697566e8ce6feadc8783519d6605552a2ca92a65bdf57e3a1b080c: Status 404 returned error can't find the container with id 654a257cd2697566e8ce6feadc8783519d6605552a2ca92a65bdf57e3a1b080c Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.092589 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.113631 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.128924 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.150365 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.169537 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.180147 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.198431 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.210945 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.225102 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.238081 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.259031 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.272020 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.284618 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.304417 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.481284 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9klvz"] Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.481798 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.481922 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.494619 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.507872 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp2sp\" (UniqueName: \"kubernetes.io/projected/02de296b-0485-4f21-abf9-51043545b565-kube-api-access-lp2sp\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.507938 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.511896 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.524869 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T
00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.540339 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.540436 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.540486 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.540557 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.541324 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95
b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d355
2ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.551863 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.560129 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.569427 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.580248 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.589879 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.597773 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.605871 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.608589 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp2sp\" (UniqueName: \"kubernetes.io/projected/02de296b-0485-4f21-abf9-51043545b565-kube-api-access-lp2sp\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.608646 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.608762 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:47 crc kubenswrapper[4713]: E0308 00:07:47.608814 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:48.108799995 +0000 UTC m=+122.228432228 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.624385 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991
d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.625657 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp2sp\" (UniqueName: \"kubernetes.io/projected/02de296b-0485-4f21-abf9-51043545b565-kube-api-access-lp2sp\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.636030 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.645629 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.655763 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:47 crc kubenswrapper[4713]: I0308 00:07:47.671873 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:47Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.050907 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" event={"ID":"2f22c2d7-0e3d-4132-b548-87e98062c766","Type":"ContainerStarted","Data":"486f1bf6be2e719226620d95e54e8e22a36b59998eb9cac6154f86fc5675234c"} Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.051002 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" event={"ID":"2f22c2d7-0e3d-4132-b548-87e98062c766","Type":"ContainerStarted","Data":"98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12"} Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.051031 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" 
event={"ID":"2f22c2d7-0e3d-4132-b548-87e98062c766","Type":"ContainerStarted","Data":"654a257cd2697566e8ce6feadc8783519d6605552a2ca92a65bdf57e3a1b080c"} Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.071872 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.093541 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.107475 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.114093 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:48 crc kubenswrapper[4713]: E0308 00:07:48.114373 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:48 crc kubenswrapper[4713]: E0308 00:07:48.114491 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:49.114458266 +0000 UTC m=+123.234090559 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.128962 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714
c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net
.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.150018 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.164798 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.178513 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.196624 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.220145 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.239617 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.274245 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.291233 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.313806 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.336091 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.352346 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.372364 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:48Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:48 crc kubenswrapper[4713]: I0308 00:07:48.540749 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:48 crc kubenswrapper[4713]: E0308 00:07:48.540910 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:49 crc kubenswrapper[4713]: I0308 00:07:49.123966 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:49 crc kubenswrapper[4713]: E0308 00:07:49.124454 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:49 crc kubenswrapper[4713]: E0308 00:07:49.124526 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:51.124503456 +0000 UTC m=+125.244135719 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:49 crc kubenswrapper[4713]: I0308 00:07:49.540520 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:49 crc kubenswrapper[4713]: I0308 00:07:49.540573 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:49 crc kubenswrapper[4713]: I0308 00:07:49.540530 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:49 crc kubenswrapper[4713]: E0308 00:07:49.540720 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:49 crc kubenswrapper[4713]: E0308 00:07:49.540790 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:49 crc kubenswrapper[4713]: E0308 00:07:49.540873 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:50 crc kubenswrapper[4713]: I0308 00:07:50.540706 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:50 crc kubenswrapper[4713]: E0308 00:07:50.541507 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.144361 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.144539 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.144638 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:07:55.144611726 +0000 UTC m=+129.264243999 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.346363 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.346484 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:08:23.346459146 +0000 UTC m=+157.466091389 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.448255 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.448390 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448513 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448556 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448576 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448614 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448630 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448666 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448681 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448685 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:23.448659993 +0000 UTC m=+157.568292256 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448719 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:23.448703394 +0000 UTC m=+157.568335667 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.448518 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448767 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:23.448733155 +0000 UTC m=+157.568365418 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.448796 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448919 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.448981 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:23.448967811 +0000 UTC m=+157.568600084 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.539816 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.539906 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.539973 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:51 crc kubenswrapper[4713]: I0308 00:07:51.539982 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.540096 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.540304 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:51 crc kubenswrapper[4713]: E0308 00:07:51.637572 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:07:52 crc kubenswrapper[4713]: I0308 00:07:52.540335 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:52 crc kubenswrapper[4713]: E0308 00:07:52.540499 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:53 crc kubenswrapper[4713]: I0308 00:07:53.540394 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:53 crc kubenswrapper[4713]: I0308 00:07:53.540430 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:53 crc kubenswrapper[4713]: I0308 00:07:53.540497 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:53 crc kubenswrapper[4713]: E0308 00:07:53.540543 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:53 crc kubenswrapper[4713]: E0308 00:07:53.540736 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:53 crc kubenswrapper[4713]: E0308 00:07:53.540810 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.211649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.211711 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.211729 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.211751 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.211769 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:54Z","lastTransitionTime":"2026-03-08T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.230804 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.235182 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.235221 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.235231 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.235245 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.235255 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:54Z","lastTransitionTime":"2026-03-08T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.254364 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.257963 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.257999 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.258011 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.258029 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.258037 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:54Z","lastTransitionTime":"2026-03-08T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.271319 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.275387 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.275420 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.275428 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.275441 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.275450 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:54Z","lastTransitionTime":"2026-03-08T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.294417 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.298591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.298639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.298658 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.298681 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.298697 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:07:54Z","lastTransitionTime":"2026-03-08T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.312676 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.312946 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:07:54 crc kubenswrapper[4713]: I0308 00:07:54.540333 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:54 crc kubenswrapper[4713]: E0308 00:07:54.540470 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:55 crc kubenswrapper[4713]: I0308 00:07:55.184027 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:55 crc kubenswrapper[4713]: E0308 00:07:55.184224 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:55 crc kubenswrapper[4713]: E0308 00:07:55.184297 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:03.184278713 +0000 UTC m=+137.303910956 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:07:55 crc kubenswrapper[4713]: I0308 00:07:55.540534 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:55 crc kubenswrapper[4713]: E0308 00:07:55.540667 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:55 crc kubenswrapper[4713]: I0308 00:07:55.540566 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:55 crc kubenswrapper[4713]: I0308 00:07:55.540552 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:55 crc kubenswrapper[4713]: E0308 00:07:55.540732 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:55 crc kubenswrapper[4713]: E0308 00:07:55.540859 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.540375 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:56 crc kubenswrapper[4713]: E0308 00:07:56.540503 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.559566 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.577519 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.590076 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.603787 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.623721 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.636137 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: E0308 00:07:56.639137 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.647873 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.664266 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.675922 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.686182 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.697195 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.706166 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b59998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.719308 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.737760 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.750008 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:56 crc kubenswrapper[4713]: I0308 00:07:56.761363 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 08 00:07:57 crc 
kubenswrapper[4713]: I0308 00:07:57.540229 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:57 crc kubenswrapper[4713]: E0308 00:07:57.540404 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:57 crc kubenswrapper[4713]: I0308 00:07:57.540489 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:57 crc kubenswrapper[4713]: E0308 00:07:57.540548 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:07:57 crc kubenswrapper[4713]: I0308 00:07:57.540593 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:57 crc kubenswrapper[4713]: E0308 00:07:57.540658 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:58 crc kubenswrapper[4713]: I0308 00:07:58.540373 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:07:58 crc kubenswrapper[4713]: E0308 00:07:58.540586 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:07:59 crc kubenswrapper[4713]: I0308 00:07:59.540975 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:07:59 crc kubenswrapper[4713]: I0308 00:07:59.541034 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:07:59 crc kubenswrapper[4713]: I0308 00:07:59.541005 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:07:59 crc kubenswrapper[4713]: E0308 00:07:59.541217 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:07:59 crc kubenswrapper[4713]: E0308 00:07:59.541323 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:07:59 crc kubenswrapper[4713]: E0308 00:07:59.541417 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:00 crc kubenswrapper[4713]: I0308 00:08:00.540681 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:00 crc kubenswrapper[4713]: E0308 00:08:00.540961 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:01 crc kubenswrapper[4713]: I0308 00:08:01.539894 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:01 crc kubenswrapper[4713]: I0308 00:08:01.539932 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:01 crc kubenswrapper[4713]: I0308 00:08:01.539966 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:01 crc kubenswrapper[4713]: E0308 00:08:01.540026 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:01 crc kubenswrapper[4713]: E0308 00:08:01.540105 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:01 crc kubenswrapper[4713]: E0308 00:08:01.540203 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:01 crc kubenswrapper[4713]: E0308 00:08:01.640762 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:02 crc kubenswrapper[4713]: I0308 00:08:02.541154 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:02 crc kubenswrapper[4713]: E0308 00:08:02.541559 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:02 crc kubenswrapper[4713]: I0308 00:08:02.542010 4713 scope.go:117] "RemoveContainer" containerID="6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.112917 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/1.log" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.119221 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406"} Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.119804 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.138665 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.148426 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc 
kubenswrapper[4713]: I0308 00:08:03.159143 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.171437 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a25275
7a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.182035 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.196649 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.207552 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:03 crc kubenswrapper[4713]: E0308 00:08:03.207868 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:03 crc kubenswrapper[4713]: E0308 00:08:03.208007 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:19.207976373 +0000 UTC m=+153.327608616 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.208614 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.223268 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.239037 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.253039 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.267983 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.287028 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.307811 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.318097 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.333410 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.351190 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/ne
tns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\
"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.540389 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.540426 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.540461 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:03 crc kubenswrapper[4713]: E0308 00:08:03.540523 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:03 crc kubenswrapper[4713]: E0308 00:08:03.540630 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:03 crc kubenswrapper[4713]: E0308 00:08:03.540748 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.555312 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.568813 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34
aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.579314 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.588800 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.598331 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.605633 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.618889 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.629025 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.644680 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.657189 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.666517 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.715668 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.730696 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.741278 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.750767 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.760669 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:03 crc kubenswrapper[4713]: I0308 00:08:03.775561 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/ne
tns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\
"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.123460 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/2.log" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.124103 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/1.log" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.126176 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" exitCode=1 Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.126211 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406"} Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.126275 4713 scope.go:117] "RemoveContainer" containerID="6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.127005 4713 scope.go:117] "RemoveContainer" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.127192 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.148754 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.163620 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.175920 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.186635 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.197810 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.215533 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.230043 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.239789 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.257063 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6671774763c93ece42b41231cc5119077b6c78c0681c42dfc8247d5f6ce2426c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:07:45Z\\\",\\\"message\\\":\\\"41Z]\\\\nI0308 00:07:45.134678 6677 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC 
\\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756f9fb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFami\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.269716 4713 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.281478 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.291108 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc 
kubenswrapper[4713]: I0308 00:08:04.300719 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.308859 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.320145 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.329744 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.540495 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.541025 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.555130 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.569901 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.569975 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.569985 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.569999 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.570010 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:04Z","lastTransitionTime":"2026-03-08T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.583189 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.585889 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.585915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.585927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.585943 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.585952 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:04Z","lastTransitionTime":"2026-03-08T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.595959 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.599197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.599242 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.599253 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.599270 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.599282 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:04Z","lastTransitionTime":"2026-03-08T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.610022 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.613113 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.613134 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.613142 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.613153 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.613161 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:04Z","lastTransitionTime":"2026-03-08T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.624684 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.627864 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.627899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.627908 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.627926 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:04 crc kubenswrapper[4713]: I0308 00:08:04.627934 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:04Z","lastTransitionTime":"2026-03-08T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.638250 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:04 crc kubenswrapper[4713]: E0308 00:08:04.638362 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.134629 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/2.log" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.138602 4713 scope.go:117] "RemoveContainer" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" Mar 08 00:08:05 crc kubenswrapper[4713]: E0308 00:08:05.138759 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.152113 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.171749 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.188985 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.199115 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc 
kubenswrapper[4713]: I0308 00:08:05.211342 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.226707 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.239428 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.250764 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.267705 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.281111 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.302348 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.315685 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.327364 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.340953 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.356459 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.370272 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.382893 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.540302 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.540436 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:05 crc kubenswrapper[4713]: E0308 00:08:05.540472 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:05 crc kubenswrapper[4713]: E0308 00:08:05.540562 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:05 crc kubenswrapper[4713]: I0308 00:08:05.540728 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:05 crc kubenswrapper[4713]: E0308 00:08:05.540883 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.540303 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:06 crc kubenswrapper[4713]: E0308 00:08:06.540660 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.552522 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.562064 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.575958 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01
f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.587435 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.598041 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.609481 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.621022 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.631225 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.640231 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: E0308 00:08:06.641666 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.649085 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.675304 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb6
8e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee78
66be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691
e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.687556 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.697908 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.715067 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.727111 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.739367 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:06 crc kubenswrapper[4713]: I0308 00:08:06.749585 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:07 crc 
kubenswrapper[4713]: I0308 00:08:07.541998 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:07 crc kubenswrapper[4713]: E0308 00:08:07.542468 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:07 crc kubenswrapper[4713]: I0308 00:08:07.542164 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:07 crc kubenswrapper[4713]: I0308 00:08:07.542276 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:07 crc kubenswrapper[4713]: E0308 00:08:07.542645 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:07 crc kubenswrapper[4713]: E0308 00:08:07.542705 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:08 crc kubenswrapper[4713]: I0308 00:08:08.540164 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:08 crc kubenswrapper[4713]: E0308 00:08:08.540384 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:09 crc kubenswrapper[4713]: I0308 00:08:09.540570 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:09 crc kubenswrapper[4713]: I0308 00:08:09.540712 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:09 crc kubenswrapper[4713]: E0308 00:08:09.540761 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:09 crc kubenswrapper[4713]: I0308 00:08:09.540575 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:09 crc kubenswrapper[4713]: E0308 00:08:09.541002 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:09 crc kubenswrapper[4713]: E0308 00:08:09.541090 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:10 crc kubenswrapper[4713]: I0308 00:08:10.540310 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:10 crc kubenswrapper[4713]: E0308 00:08:10.540537 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:11 crc kubenswrapper[4713]: I0308 00:08:11.540655 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:11 crc kubenswrapper[4713]: I0308 00:08:11.540725 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:11 crc kubenswrapper[4713]: E0308 00:08:11.540813 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:11 crc kubenswrapper[4713]: I0308 00:08:11.540933 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:11 crc kubenswrapper[4713]: E0308 00:08:11.541133 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:11 crc kubenswrapper[4713]: E0308 00:08:11.541163 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:11 crc kubenswrapper[4713]: E0308 00:08:11.643268 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:12 crc kubenswrapper[4713]: I0308 00:08:12.540764 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:12 crc kubenswrapper[4713]: E0308 00:08:12.541084 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:13 crc kubenswrapper[4713]: I0308 00:08:13.540721 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:13 crc kubenswrapper[4713]: I0308 00:08:13.540775 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:13 crc kubenswrapper[4713]: E0308 00:08:13.540871 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:13 crc kubenswrapper[4713]: I0308 00:08:13.540799 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:13 crc kubenswrapper[4713]: E0308 00:08:13.540970 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:13 crc kubenswrapper[4713]: E0308 00:08:13.541026 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:14 crc kubenswrapper[4713]: I0308 00:08:14.540132 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:14 crc kubenswrapper[4713]: E0308 00:08:14.540274 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.010578 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.011213 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.011369 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.011625 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.011640 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:15Z","lastTransitionTime":"2026-03-08T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.033042 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.036728 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.036761 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.036770 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.036784 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.036795 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:15Z","lastTransitionTime":"2026-03-08T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.048729 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.052197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.052335 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.052407 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.052491 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.052588 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:15Z","lastTransitionTime":"2026-03-08T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.065508 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.068885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.069030 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.069105 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.069181 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.069244 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:15Z","lastTransitionTime":"2026-03-08T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.082878 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.087171 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.087198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.087208 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.087223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.087234 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:15Z","lastTransitionTime":"2026-03-08T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.102093 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.102210 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.540320 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.540436 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.540654 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.540928 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:15 crc kubenswrapper[4713]: I0308 00:08:15.540972 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:15 crc kubenswrapper[4713]: E0308 00:08:15.541176 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.540133 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:16 crc kubenswrapper[4713]: E0308 00:08:16.540320 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.561919 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.593308 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.612239 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.626803 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc 
kubenswrapper[4713]: I0308 00:08:16.641711 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: E0308 00:08:16.644417 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.656569 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.671358 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.681309 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.703560 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.716743 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.728543 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.738874 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.763984 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.785488 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.796004 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.806976 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:16 crc kubenswrapper[4713]: I0308 00:08:16.817669 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:17 crc kubenswrapper[4713]: I0308 00:08:17.540493 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:17 crc kubenswrapper[4713]: E0308 00:08:17.540686 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:17 crc kubenswrapper[4713]: I0308 00:08:17.540722 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:17 crc kubenswrapper[4713]: I0308 00:08:17.540802 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:17 crc kubenswrapper[4713]: E0308 00:08:17.541378 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:17 crc kubenswrapper[4713]: E0308 00:08:17.541466 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:17 crc kubenswrapper[4713]: I0308 00:08:17.542024 4713 scope.go:117] "RemoveContainer" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" Mar 08 00:08:17 crc kubenswrapper[4713]: E0308 00:08:17.542291 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:18 crc kubenswrapper[4713]: I0308 00:08:18.540398 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:18 crc kubenswrapper[4713]: E0308 00:08:18.540533 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:19 crc kubenswrapper[4713]: I0308 00:08:19.269806 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:19 crc kubenswrapper[4713]: E0308 00:08:19.269993 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:19 crc kubenswrapper[4713]: E0308 00:08:19.270076 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:08:51.270059668 +0000 UTC m=+185.389691901 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:19 crc kubenswrapper[4713]: I0308 00:08:19.540522 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:19 crc kubenswrapper[4713]: I0308 00:08:19.540561 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:19 crc kubenswrapper[4713]: I0308 00:08:19.540522 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:19 crc kubenswrapper[4713]: E0308 00:08:19.540642 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:19 crc kubenswrapper[4713]: E0308 00:08:19.540884 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:19 crc kubenswrapper[4713]: E0308 00:08:19.540961 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:20 crc kubenswrapper[4713]: I0308 00:08:20.540517 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:20 crc kubenswrapper[4713]: E0308 00:08:20.540677 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:21 crc kubenswrapper[4713]: I0308 00:08:21.540408 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:21 crc kubenswrapper[4713]: I0308 00:08:21.540467 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:21 crc kubenswrapper[4713]: E0308 00:08:21.540548 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:21 crc kubenswrapper[4713]: I0308 00:08:21.540435 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:21 crc kubenswrapper[4713]: E0308 00:08:21.540771 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:21 crc kubenswrapper[4713]: E0308 00:08:21.540882 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:21 crc kubenswrapper[4713]: E0308 00:08:21.646181 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.198182 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/0.log" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.198276 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf95e3f7-808b-434f-8fd4-c7e7365a1561" containerID="f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2" exitCode=1 Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.198324 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerDied","Data":"f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2"} Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.199016 4713 scope.go:117] "RemoveContainer" containerID="f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.214226 4713 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\
"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.247639 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir
\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a
5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.266891 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.281194 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.292221 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.306234 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.318056 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.332262 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.344775 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.362463 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.375940 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:22Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.390759 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc 
kubenswrapper[4713]: I0308 00:08:22.404469 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.415349 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.426871 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.435914 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.448342 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:22 crc kubenswrapper[4713]: I0308 00:08:22.540549 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:22 crc kubenswrapper[4713]: E0308 00:08:22.540804 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.202773 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/0.log" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.202840 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerStarted","Data":"889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f"} Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.220284 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.234260 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.246543 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.260415 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.272252 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.296577 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.308702 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.319699 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.338543 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.358583 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.373037 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.384141 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.396263 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T
00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.404452 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.404553 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:27.404531333 +0000 UTC m=+221.524163566 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.410790 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finish
edAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1c
e8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-0
8T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.423510 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.437703 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.450147 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.505985 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.506058 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.506100 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.506126 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506143 4713 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506228 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:27.506208048 +0000 UTC m=+221.625840281 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506244 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506285 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506325 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506303 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:27.50628544 +0000 UTC m=+221.625917753 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506340 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506423 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506469 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506487 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506517 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:27.506484385 +0000 UTC m=+221.626116658 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.506546 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:27.506530886 +0000 UTC m=+221.626163129 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.540044 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.540070 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.540182 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:23 crc kubenswrapper[4713]: I0308 00:08:23.540264 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.540388 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:23 crc kubenswrapper[4713]: E0308 00:08:23.540545 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:24 crc kubenswrapper[4713]: I0308 00:08:24.541073 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:24 crc kubenswrapper[4713]: E0308 00:08:24.541288 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.395310 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.395346 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.395354 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.395367 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.395376 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:25Z","lastTransitionTime":"2026-03-08T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.417152 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.423912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.423960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.423977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.424002 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.424019 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:25Z","lastTransitionTime":"2026-03-08T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.446982 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.455018 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.455074 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.455092 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.455118 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.455134 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:25Z","lastTransitionTime":"2026-03-08T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.469209 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.474516 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.474676 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.474777 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.474930 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.475033 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:25Z","lastTransitionTime":"2026-03-08T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.489944 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.494459 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.494522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.494539 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.494562 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.494582 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:25Z","lastTransitionTime":"2026-03-08T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.507979 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.508111 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.540541 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.540567 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:25 crc kubenswrapper[4713]: I0308 00:08:25.540541 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.540647 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.540730 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:25 crc kubenswrapper[4713]: E0308 00:08:25.540777 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.540640 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:26 crc kubenswrapper[4713]: E0308 00:08:26.541088 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.553904 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.555524 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.588340 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.605478 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.623646 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: E0308 00:08:26.647753 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.648891 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.667415 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.687432 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.705234 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.720131 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.738508 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.758916 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.776304 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.791041 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.823084 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.842289 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.854892 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:26 crc kubenswrapper[4713]: I0308 00:08:26.866754 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:27 crc kubenswrapper[4713]: I0308 00:08:27.540619 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:27 crc kubenswrapper[4713]: I0308 00:08:27.540748 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:27 crc kubenswrapper[4713]: I0308 00:08:27.540813 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:27 crc kubenswrapper[4713]: E0308 00:08:27.541060 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:27 crc kubenswrapper[4713]: E0308 00:08:27.541310 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:27 crc kubenswrapper[4713]: E0308 00:08:27.541471 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:28 crc kubenswrapper[4713]: I0308 00:08:28.540441 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:28 crc kubenswrapper[4713]: E0308 00:08:28.540646 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:29 crc kubenswrapper[4713]: I0308 00:08:29.540258 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:29 crc kubenswrapper[4713]: I0308 00:08:29.540321 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:29 crc kubenswrapper[4713]: I0308 00:08:29.540382 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:29 crc kubenswrapper[4713]: E0308 00:08:29.541152 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:29 crc kubenswrapper[4713]: E0308 00:08:29.541255 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:29 crc kubenswrapper[4713]: E0308 00:08:29.541002 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:30 crc kubenswrapper[4713]: I0308 00:08:30.540757 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:30 crc kubenswrapper[4713]: E0308 00:08:30.541208 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:30 crc kubenswrapper[4713]: I0308 00:08:30.553886 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 08 00:08:31 crc kubenswrapper[4713]: I0308 00:08:31.540576 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:31 crc kubenswrapper[4713]: E0308 00:08:31.540751 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:31 crc kubenswrapper[4713]: I0308 00:08:31.540586 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:31 crc kubenswrapper[4713]: I0308 00:08:31.540576 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:31 crc kubenswrapper[4713]: E0308 00:08:31.541209 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:31 crc kubenswrapper[4713]: I0308 00:08:31.541392 4713 scope.go:117] "RemoveContainer" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" Mar 08 00:08:31 crc kubenswrapper[4713]: E0308 00:08:31.541424 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:31 crc kubenswrapper[4713]: E0308 00:08:31.650022 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.234765 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/2.log" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.237965 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e"} Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.238407 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.254423 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.268241 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.278629 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.295644 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.307746 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.323619 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.338561 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfed0950-276b-4126-a600-1031513708f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea7e2638bea2767584ec8289d15911e98d3f0a7ae48a032b89b4466bd807e8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 00:06:12.289983 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 00:06:12.291350 1 observer_polling.go:159] Starting file observer\\\\nI0308 00:06:12.292878 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 00:06:12.293790 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 00:06:41.970411 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 00:06:41.970630 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:41Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b67e28c29833077f4c11144409783e14d6a3b1875012c1e86c576cae0b38e46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59d7811343e8c519ce7d8d96d1ef70f2cecb384c1fe32fcee17e814e5abb99b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.350907 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.361633 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.371216 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.380248 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.390075 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.407433 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.416763 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.430680 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.461658 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.480207 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"773e859d-0b8b-4dd0-87d1-2987e2092881\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a361c383172f4481b046398c6a434f347b26cf18a9b0c2d77652114eb089de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.496450 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc 
kubenswrapper[4713]: I0308 00:08:32.514445 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:32 crc kubenswrapper[4713]: I0308 00:08:32.540723 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:32 crc kubenswrapper[4713]: E0308 00:08:32.540927 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.242860 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/3.log" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.243600 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/2.log" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.246344 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" exitCode=1 Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.246390 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e"} Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.246427 4713 scope.go:117] "RemoveContainer" containerID="5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.247141 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:08:33 crc kubenswrapper[4713]: E0308 00:08:33.247322 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:33 crc 
kubenswrapper[4713]: I0308 00:08:33.263534 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.274383 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.298292 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.318252 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370
411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.334345 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.352328 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfed0950-276b-4126-a600-1031513708f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea7e2638bea2767584ec8289d15911e98d3f0a7ae48a032b89b4466bd807e8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 00:06:12.289983 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 00:06:12.291350 1 observer_polling.go:159] Starting file observer\\\\nI0308 00:06:12.292878 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 00:06:12.293790 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 00:06:41.970411 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 00:06:41.970630 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:41Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b67e28c29833077f4c11144409783e14d6a3b1875012c1e86c576cae0b38e46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59d7811343e8c519ce7d8d96d1ef70f2cecb384c1fe32fcee17e814e5abb99b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.371892 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.391679 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.412004 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.429405 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.444000 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.472324 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.487096 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.501290 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.522223 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f6934a55a247f619f691c42c3ed91f8f29bbadc8a6f725435d9de70fe5da406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:03Z is after 2025-08-24T17:21:41Z]\\\\nI0308 00:08:03.483432 7001 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 00:08:03.483433 7001 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-fp2h2\\\\nI0308 00:08:03.483436 7001 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 00:08:03.483440 7001 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 00:08:03.483444 7001 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:32Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 00:08:32.539634 7335 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0308 00:08:32.539681 7335 
address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0308 00:08:32.539725 7335 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0308 00:08:32.539812 7335 factory.go:1336] Added *v1.Node event handler 7\\\\nI0308 00:08:32.539895 7335 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0308 00:08:32.540296 7335 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 00:08:32.540403 7335 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 00:08:32.542051 7335 ovnkube.go:599] Stopped ovnkube\\\\nI0308 00:08:32.542107 7335 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 00:08:32.542214 7335 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\
\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.534695 4713 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"773e859d-0b8b-4dd0-87d1-2987e2092881\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a361c383172f4481b046398c6a434f347b26cf18a9b0c2d77652114eb089de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.540338 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.540377 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.540399 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:33 crc kubenswrapper[4713]: E0308 00:08:33.540503 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:33 crc kubenswrapper[4713]: E0308 00:08:33.540599 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:33 crc kubenswrapper[4713]: E0308 00:08:33.540699 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.548872 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.562423 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastS
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":
\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:33 crc kubenswrapper[4713]: I0308 00:08:33.573756 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc 
kubenswrapper[4713]: I0308 00:08:34.251925 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/3.log" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.255692 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:08:34 crc kubenswrapper[4713]: E0308 00:08:34.256017 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.275405 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.294186 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.310455 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.325425 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.342576 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec6527542
3377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.356936 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.367512 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.380008 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.398929 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00
:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.412678 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.425592 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfed0950-276b-4126-a600-1031513708f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea7e2638bea2767584ec8289d15911e98d3f0a7ae48a032b89b4466bd807e8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 00:06:12.289983 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 00:06:12.291350 1 observer_polling.go:159] Starting file observer\\\\nI0308 00:06:12.292878 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 00:06:12.293790 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 00:06:41.970411 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 00:06:41.970630 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:41Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b67e28c29833077f4c11144409783e14d6a3b1875012c1e86c576cae0b38e46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59d7811343e8c519ce7d8d96d1ef70f2cecb384c1fe32fcee17e814e5abb99b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.436952 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.448404 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.462238 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.472202 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"773e859d-0b8b-4dd0-87d1-2987e2092881\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a361c383172f4481b046398c6a434f347b26cf18a9b0c2d77652114eb089de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.489121 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.514111 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:32Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
00:08:32.539634 7335 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0308 00:08:32.539681 7335 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0308 00:08:32.539725 7335 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0308 00:08:32.539812 7335 factory.go:1336] Added *v1.Node event handler 7\\\\nI0308 00:08:32.539895 7335 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0308 00:08:32.540296 7335 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 00:08:32.540403 7335 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 00:08:32.542051 7335 ovnkube.go:599] Stopped ovnkube\\\\nI0308 00:08:32.542107 7335 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 00:08:32.542214 7335 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.529587 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.539630 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:34 crc kubenswrapper[4713]: I0308 00:08:34.540938 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:34 crc kubenswrapper[4713]: E0308 00:08:34.541080 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.540725 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.541340 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.541020 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.541461 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.540750 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.541562 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.727512 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.727553 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.727565 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.727608 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.727621 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:35Z","lastTransitionTime":"2026-03-08T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.742239 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.746922 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.747014 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.747030 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.747052 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.747069 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:35Z","lastTransitionTime":"2026-03-08T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.761594 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.765259 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.765309 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.765322 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.765340 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.765353 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:35Z","lastTransitionTime":"2026-03-08T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.779388 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.783584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.783619 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.783627 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.783668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.783678 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:35Z","lastTransitionTime":"2026-03-08T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.803006 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.806960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.806995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.807008 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.807026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:35 crc kubenswrapper[4713]: I0308 00:08:35.807038 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:35Z","lastTransitionTime":"2026-03-08T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.825932 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e399c248-6394-463b-9421-3cdd5fff0be8\\\",\\\"systemUUID\\\":\\\"2aa69308-6450-4bec-8579-2da85b0e580a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:35 crc kubenswrapper[4713]: E0308 00:08:35.826204 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.540388 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:36 crc kubenswrapper[4713]: E0308 00:08:36.540597 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.563555 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfed0950-276b-4126-a600-1031513708f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea7e2638bea2767584ec8289d15911e98d3f0a7ae48a032b89b4466bd807e8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bd9c48a8ffb3ecc96d21e191df7975812e597dc665a5487517ab278f89515cc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop 
'(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0308 00:06:12.289983 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 00:06:12.291350 1 observer_polling.go:159] Starting file observer\\\\nI0308 00:06:12.292878 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 00:06:12.293790 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 00:06:41.970411 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 00:06:41.970630 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:06:41Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b67e28c29833077f4c11144409783e14d6a3b1875012c1e86c576cae0b38e46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59d7811343e8c519ce7d8d96d1ef70f2cecb384c1fe32fcee17e814e5abb99b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.583123 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.599912 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b0db69397d8d463dff465799530f84d973a3a1ce1c2f9a9d430ebc5878b569d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4403ef69407710862bce3409b3e809a0b850fe503fe870755ea950f82bbbd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.618012 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.637000 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e00ab6f0266491d7bda1ff74f8e48f615fe0d9130686ddaeee53be7061720c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac199245af459acead4b5879445fc603296f72d2
7886545be5fc80257bd154fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlmxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4kr8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: E0308 00:08:36.651196 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.652643 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d9bpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23406c9e-4ba0-4b59-a360-fb325a1adb0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cb4bca06368c64f2c934d25d6a042309b63c037569507504652af7126e51352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5r7qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d9bpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.679874 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4673fe5-8264-4062-b008-d6a1b693d334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d6ae8d4290e533f6ba19b5059787e0786d942993db3d185ea64ff166239b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb6
8e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb73d557f39270843a4882d397a6c91a68bd4dc6a9e6970cb9d2e6658c0ad2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d630335e96c320ec67ae449db03f60cf86fb0fc019130b805be32eff8cd7c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee78
66be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97d991d7ec8d9ad3484d6ad22afde51389da0444f80191e07770cab3fdae8857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6914df926e52fb5e19df69ae12dfd41ee0eb86cc9253c87c510234883988cff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42569baac8bbedf33b2c7c14564468fb92e1833ce535601e9e1f371748f5d4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b42aa767a4ed2500a16a0d026667aa4356e25476508285b977b3468ce7fba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691
e00f6e6b9bd8fddcb74591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://140d80b691e66304f0405c80d1f9089a1cb60e7691e00f6e6b9bd8fddcb74591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.698729 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160301c9-6c5f-40f1-a40f-a0498b367a6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T00:06:53Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 00:06:53.192348 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 00:06:53.192481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 00:06:53.193151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2587190523/tls.crt::/tmp/serving-cert-2587190523/tls.key\\\\\\\"\\\\nI0308 00:06:53.580580 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 00:06:53.583156 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 00:06:53.583177 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 00:06:53.583197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 00:06:53.583202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 00:06:53.590718 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 00:06:53.590745 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 00:06:53.590754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 00:06:53.590757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 00:06:53.590760 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 00:06:53.590763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 00:06:53.590965 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 00:06:53.592231 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:06:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://982004a53f1ffe4be435bd18b7277e421
55502af709b8976e148caa6b4211510\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.715412 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f22c2d7-0e3d-4132-b548-87e98062c766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f9429f468fa364a9888992c1fc62dff1b17294ce018fee40d6bc63ebee8c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486f1bf6be2e719226620d95e54e8e22a36b5
9998eb9cac6154f86fc5675234c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.740590 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56fbba07-87e8-4e77-b834-ed68af718d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:32Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
00:08:32.539634 7335 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0308 00:08:32.539681 7335 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0308 00:08:32.539725 7335 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0308 00:08:32.539812 7335 factory.go:1336] Added *v1.Node event handler 7\\\\nI0308 00:08:32.539895 7335 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0308 00:08:32.540296 7335 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 00:08:32.540403 7335 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 00:08:32.542051 7335 ovnkube.go:599] Stopped ovnkube\\\\nI0308 00:08:32.542107 7335 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 00:08:32.542214 7335 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:08:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13edcd5e41775d8486
81af8502e2bf58944ec4535d09586d8fa3d5327febb09d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zl27z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gsfft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.756985 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"773e859d-0b8b-4dd0-87d1-2987e2092881\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a361c383172f4481b046398c6a434f347b26cf18a9b0c2d77652114eb089de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d658364e9c1f5f65d5e924ee33045fcbbd5d465c9efbf86c8f03dfcf5dc36675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.806323 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c32afd26406974393efb534a59b5011df86ecf45cde4f0eadefcf2e41f9b3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.822610 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fh96f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf95e3f7-808b-434f-8fd4-c7e7365a1561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T00:08:21Z\\\",\\\"message\\\":\\\"2026-03-08T00:07:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14\\\\n2026-03-08T00:07:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3990e67c-099f-4787-bb76-e8e8b28a5f14 to /host/opt/cni/bin/\\\\n2026-03-08T00:07:36Z [verbose] multus-daemon started\\\\n2026-03-08T00:07:36Z [verbose] Readiness Indicator file check\\\\n2026-03-08T00:08:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bv9p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fh96f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.838748 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9klvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02de296b-0485-4f21-abf9-51043545b565\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp2sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9klvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc 
kubenswrapper[4713]: I0308 00:08:36.854765 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://703927e61274693e44221ee9ebeb695ef30bacae0734a01c16208d1eb045a46b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.867217 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fp2h2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34185fa0-b348-45e6-990e-4bb01410d564\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edb310b4f3ac2e8beb6797e886d2cbde80960234f1d76878e962ccf2655c9fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk47b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fp2h2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.882683 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-54zzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dbbe8c-4ae1-4a6b-9b62-eac6a5c73205\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c115813ca65a75182e98392a478d8ec65275423377bc44b2d31f640d1677ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6747c06d0458b80ad0377b15559ae88f45a816082c0384ca8e2954dacd52425\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e113cfbffe78b563db26f9d9faa41bc5890236cca73c40a14473720a3b4f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4745632daf717eba89f39c3958568dde61deb4eef0aa28bc41da20861b20b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3e7
adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f3e7adeff04c8f4e7d693e614bdf266c0955a98d565a7217dda0ea60c980625\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b82b6d2c8b485bc7b42e7571dab1b01f36bca08e82f1ce8d527810c6c027aee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3167049a252757a48b4ec9422d4abb9a5cc223e435b88ab32c2fd1d3552ef208\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-928t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:07:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-54zzt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.896413 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18d4c436-d96e-4238-a331-e31bbba3ef13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4f2e2a2032fc81a42fc85a39850f466a62c05bac6854649c6f1cf4cd351d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be2a9168107359e36f3374d00388edf302f4f04e75b6341365adc72fa8fc5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd7083511dc3876b161d2a5d4bdb150add9f6dac94659eb413736834dbdf0e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T00:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370
411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8ae56bd56be8a30fd3029370411e72ff83d64b3476cf80e2c5ec9323bc8be6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T00:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T00:05:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T00:05:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:36 crc kubenswrapper[4713]: I0308 00:08:36.914205 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 08 00:08:37 crc kubenswrapper[4713]: I0308 00:08:37.540106 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:37 crc kubenswrapper[4713]: I0308 00:08:37.540355 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:37 crc kubenswrapper[4713]: I0308 00:08:37.540444 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:37 crc kubenswrapper[4713]: E0308 00:08:37.540486 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:37 crc kubenswrapper[4713]: E0308 00:08:37.540606 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:37 crc kubenswrapper[4713]: E0308 00:08:37.540885 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:38 crc kubenswrapper[4713]: I0308 00:08:38.540299 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:38 crc kubenswrapper[4713]: E0308 00:08:38.540523 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:39 crc kubenswrapper[4713]: I0308 00:08:39.540619 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:39 crc kubenswrapper[4713]: I0308 00:08:39.540622 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:39 crc kubenswrapper[4713]: E0308 00:08:39.540776 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:39 crc kubenswrapper[4713]: E0308 00:08:39.540956 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:39 crc kubenswrapper[4713]: I0308 00:08:39.542004 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:39 crc kubenswrapper[4713]: E0308 00:08:39.542281 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:40 crc kubenswrapper[4713]: I0308 00:08:40.541016 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:40 crc kubenswrapper[4713]: E0308 00:08:40.542109 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:41 crc kubenswrapper[4713]: I0308 00:08:41.540368 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:41 crc kubenswrapper[4713]: I0308 00:08:41.540404 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:41 crc kubenswrapper[4713]: E0308 00:08:41.540480 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:41 crc kubenswrapper[4713]: I0308 00:08:41.540502 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:41 crc kubenswrapper[4713]: E0308 00:08:41.540650 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:41 crc kubenswrapper[4713]: E0308 00:08:41.540683 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:41 crc kubenswrapper[4713]: E0308 00:08:41.652645 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:42 crc kubenswrapper[4713]: I0308 00:08:42.540493 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:42 crc kubenswrapper[4713]: E0308 00:08:42.540688 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:43 crc kubenswrapper[4713]: I0308 00:08:43.540941 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:43 crc kubenswrapper[4713]: I0308 00:08:43.540971 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:43 crc kubenswrapper[4713]: I0308 00:08:43.540945 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:43 crc kubenswrapper[4713]: E0308 00:08:43.541187 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:43 crc kubenswrapper[4713]: E0308 00:08:43.541285 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:43 crc kubenswrapper[4713]: E0308 00:08:43.541421 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:44 crc kubenswrapper[4713]: I0308 00:08:44.540973 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:44 crc kubenswrapper[4713]: E0308 00:08:44.541162 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.540025 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.540107 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.540037 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:45 crc kubenswrapper[4713]: E0308 00:08:45.540215 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:45 crc kubenswrapper[4713]: E0308 00:08:45.540364 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:45 crc kubenswrapper[4713]: E0308 00:08:45.540491 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.939752 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.939850 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.939865 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.939886 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 00:08:45 crc kubenswrapper[4713]: I0308 00:08:45.939899 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T00:08:45Z","lastTransitionTime":"2026-03-08T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.017984 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j"] Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.018524 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.020419 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.020668 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.021234 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.022231 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.052925 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=16.052903051 podStartE2EDuration="16.052903051s" podCreationTimestamp="2026-03-08 00:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 
00:08:46.035637328 +0000 UTC m=+180.155269571" watchObservedRunningTime="2026-03-08 00:08:46.052903051 +0000 UTC m=+180.172535284" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.124803 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fh96f" podStartSLOduration=108.124786588 podStartE2EDuration="1m48.124786588s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.111157075 +0000 UTC m=+180.230789308" watchObservedRunningTime="2026-03-08 00:08:46.124786588 +0000 UTC m=+180.244418821" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.139540 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.139527628 podStartE2EDuration="42.139527628s" podCreationTimestamp="2026-03-08 00:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.139482137 +0000 UTC m=+180.259114410" watchObservedRunningTime="2026-03-08 00:08:46.139527628 +0000 UTC m=+180.259159861" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.163081 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.163182 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/83f7b3f3-83a6-447a-8858-960ae6c3006f-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.163220 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83f7b3f3-83a6-447a-8858-960ae6c3006f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.163260 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83f7b3f3-83a6-447a-8858-960ae6c3006f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.163474 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.193949 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fp2h2" podStartSLOduration=109.193930355 podStartE2EDuration="1m49.193930355s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.193625657 +0000 UTC 
m=+180.313257900" watchObservedRunningTime="2026-03-08 00:08:46.193930355 +0000 UTC m=+180.313562588" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.218939 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-54zzt" podStartSLOduration=108.218914693 podStartE2EDuration="1m48.218914693s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.218238086 +0000 UTC m=+180.337870329" watchObservedRunningTime="2026-03-08 00:08:46.218914693 +0000 UTC m=+180.338546966" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.258318 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podStartSLOduration=109.258291482 podStartE2EDuration="1m49.258291482s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.248192358 +0000 UTC m=+180.367824621" watchObservedRunningTime="2026-03-08 00:08:46.258291482 +0000 UTC m=+180.377923745" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.258728 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d9bpk" podStartSLOduration=109.258719143 podStartE2EDuration="1m49.258719143s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.258651011 +0000 UTC m=+180.378283254" watchObservedRunningTime="2026-03-08 00:08:46.258719143 +0000 UTC m=+180.378351416" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.264399 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.264569 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.264775 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.265026 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/83f7b3f3-83a6-447a-8858-960ae6c3006f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.264907 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/83f7b3f3-83a6-447a-8858-960ae6c3006f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.265194 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83f7b3f3-83a6-447a-8858-960ae6c3006f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.265440 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83f7b3f3-83a6-447a-8858-960ae6c3006f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.267640 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/83f7b3f3-83a6-447a-8858-960ae6c3006f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.270785 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83f7b3f3-83a6-447a-8858-960ae6c3006f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.286559 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/83f7b3f3-83a6-447a-8858-960ae6c3006f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zrg6j\" (UID: \"83f7b3f3-83a6-447a-8858-960ae6c3006f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.311985 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=85.311956211 podStartE2EDuration="1m25.311956211s" podCreationTimestamp="2026-03-08 00:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.311417767 +0000 UTC m=+180.431050020" watchObservedRunningTime="2026-03-08 00:08:46.311956211 +0000 UTC m=+180.431588484" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.312568 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.312551006 podStartE2EDuration="1m7.312551006s" podCreationTimestamp="2026-03-08 00:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.291729042 +0000 UTC m=+180.411361285" watchObservedRunningTime="2026-03-08 00:08:46.312551006 +0000 UTC m=+180.432183279" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.324296 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=20.32427949 podStartE2EDuration="20.32427949s" podCreationTimestamp="2026-03-08 00:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.323155552 +0000 UTC m=+180.442787805" watchObservedRunningTime="2026-03-08 00:08:46.32427949 +0000 UTC m=+180.443911743" Mar 08 00:08:46 crc 
kubenswrapper[4713]: I0308 00:08:46.337101 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" Mar 08 00:08:46 crc kubenswrapper[4713]: W0308 00:08:46.352606 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83f7b3f3_83a6_447a_8858_960ae6c3006f.slice/crio-ef2a9f0ecf98ad897fbf467736af398c43d9c2440e4ead7712886762f90557c8 WatchSource:0}: Error finding container ef2a9f0ecf98ad897fbf467736af398c43d9c2440e4ead7712886762f90557c8: Status 404 returned error can't find the container with id ef2a9f0ecf98ad897fbf467736af398c43d9c2440e4ead7712886762f90557c8 Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.392185 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2j6r" podStartSLOduration=108.392164636 podStartE2EDuration="1m48.392164636s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:46.376985375 +0000 UTC m=+180.496617598" watchObservedRunningTime="2026-03-08 00:08:46.392164636 +0000 UTC m=+180.511796869" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.540632 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:46 crc kubenswrapper[4713]: E0308 00:08:46.542586 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.586798 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 08 00:08:46 crc kubenswrapper[4713]: I0308 00:08:46.596393 4713 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 00:08:46 crc kubenswrapper[4713]: E0308 00:08:46.654315 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.301279 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" event={"ID":"83f7b3f3-83a6-447a-8858-960ae6c3006f","Type":"ContainerStarted","Data":"802fa8b46c29d33985b26a594c4dd0ef927c5969db901ae90aff984c19581262"} Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.301335 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" event={"ID":"83f7b3f3-83a6-447a-8858-960ae6c3006f","Type":"ContainerStarted","Data":"ef2a9f0ecf98ad897fbf467736af398c43d9c2440e4ead7712886762f90557c8"} Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.324601 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zrg6j" podStartSLOduration=109.324566693 podStartE2EDuration="1m49.324566693s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:08:47.323419194 +0000 UTC m=+181.443051437" 
watchObservedRunningTime="2026-03-08 00:08:47.324566693 +0000 UTC m=+181.444198956" Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.540620 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.540755 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.540938 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:47 crc kubenswrapper[4713]: E0308 00:08:47.541096 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:47 crc kubenswrapper[4713]: E0308 00:08:47.541280 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:47 crc kubenswrapper[4713]: E0308 00:08:47.541908 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:47 crc kubenswrapper[4713]: I0308 00:08:47.542638 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:08:47 crc kubenswrapper[4713]: E0308 00:08:47.542977 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:08:48 crc kubenswrapper[4713]: I0308 00:08:48.540959 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:48 crc kubenswrapper[4713]: E0308 00:08:48.541171 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:49 crc kubenswrapper[4713]: I0308 00:08:49.540753 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:49 crc kubenswrapper[4713]: I0308 00:08:49.540870 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:49 crc kubenswrapper[4713]: I0308 00:08:49.540789 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:49 crc kubenswrapper[4713]: E0308 00:08:49.541008 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:49 crc kubenswrapper[4713]: E0308 00:08:49.541136 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:49 crc kubenswrapper[4713]: E0308 00:08:49.541287 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:50 crc kubenswrapper[4713]: I0308 00:08:50.540931 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:50 crc kubenswrapper[4713]: E0308 00:08:50.541130 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:51 crc kubenswrapper[4713]: I0308 00:08:51.323524 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.323720 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.323859 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs podName:02de296b-0485-4f21-abf9-51043545b565 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:55.32380577 +0000 UTC m=+249.443438043 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs") pod "network-metrics-daemon-9klvz" (UID: "02de296b-0485-4f21-abf9-51043545b565") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 00:08:51 crc kubenswrapper[4713]: I0308 00:08:51.540327 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:51 crc kubenswrapper[4713]: I0308 00:08:51.540390 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:51 crc kubenswrapper[4713]: I0308 00:08:51.540558 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.540702 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.540867 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.540983 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:51 crc kubenswrapper[4713]: E0308 00:08:51.655885 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:52 crc kubenswrapper[4713]: I0308 00:08:52.540651 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:52 crc kubenswrapper[4713]: E0308 00:08:52.540919 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:53 crc kubenswrapper[4713]: I0308 00:08:53.540547 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:53 crc kubenswrapper[4713]: I0308 00:08:53.540616 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:53 crc kubenswrapper[4713]: I0308 00:08:53.540691 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:53 crc kubenswrapper[4713]: E0308 00:08:53.540857 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:53 crc kubenswrapper[4713]: E0308 00:08:53.541017 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:53 crc kubenswrapper[4713]: E0308 00:08:53.541151 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:54 crc kubenswrapper[4713]: I0308 00:08:54.540552 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:54 crc kubenswrapper[4713]: E0308 00:08:54.540747 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:55 crc kubenswrapper[4713]: I0308 00:08:55.540134 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:55 crc kubenswrapper[4713]: I0308 00:08:55.540270 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:55 crc kubenswrapper[4713]: E0308 00:08:55.540377 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:55 crc kubenswrapper[4713]: E0308 00:08:55.540793 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:55 crc kubenswrapper[4713]: I0308 00:08:55.541029 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:55 crc kubenswrapper[4713]: E0308 00:08:55.541208 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:56 crc kubenswrapper[4713]: I0308 00:08:56.539984 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:56 crc kubenswrapper[4713]: E0308 00:08:56.543622 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:56 crc kubenswrapper[4713]: E0308 00:08:56.657674 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:08:57 crc kubenswrapper[4713]: I0308 00:08:57.540022 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:57 crc kubenswrapper[4713]: I0308 00:08:57.540088 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:57 crc kubenswrapper[4713]: I0308 00:08:57.540260 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:57 crc kubenswrapper[4713]: E0308 00:08:57.540960 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:57 crc kubenswrapper[4713]: E0308 00:08:57.541068 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:57 crc kubenswrapper[4713]: E0308 00:08:57.541571 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:58 crc kubenswrapper[4713]: I0308 00:08:58.540113 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:08:58 crc kubenswrapper[4713]: E0308 00:08:58.540427 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:08:59 crc kubenswrapper[4713]: I0308 00:08:59.540588 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:08:59 crc kubenswrapper[4713]: E0308 00:08:59.540769 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:08:59 crc kubenswrapper[4713]: I0308 00:08:59.540787 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:08:59 crc kubenswrapper[4713]: I0308 00:08:59.540771 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:08:59 crc kubenswrapper[4713]: E0308 00:08:59.541455 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:08:59 crc kubenswrapper[4713]: E0308 00:08:59.541699 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:08:59 crc kubenswrapper[4713]: I0308 00:08:59.541765 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:08:59 crc kubenswrapper[4713]: E0308 00:08:59.542052 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:09:00 crc kubenswrapper[4713]: I0308 00:09:00.540413 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:00 crc kubenswrapper[4713]: E0308 00:09:00.540625 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:01 crc kubenswrapper[4713]: I0308 00:09:01.540633 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:01 crc kubenswrapper[4713]: E0308 00:09:01.540795 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:01 crc kubenswrapper[4713]: I0308 00:09:01.540634 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:01 crc kubenswrapper[4713]: I0308 00:09:01.540887 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:01 crc kubenswrapper[4713]: E0308 00:09:01.541303 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:01 crc kubenswrapper[4713]: E0308 00:09:01.541509 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:01 crc kubenswrapper[4713]: E0308 00:09:01.658430 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:09:02 crc kubenswrapper[4713]: I0308 00:09:02.540612 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:02 crc kubenswrapper[4713]: E0308 00:09:02.540759 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:03 crc kubenswrapper[4713]: I0308 00:09:03.539964 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:03 crc kubenswrapper[4713]: I0308 00:09:03.540006 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:03 crc kubenswrapper[4713]: I0308 00:09:03.540415 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:03 crc kubenswrapper[4713]: E0308 00:09:03.541462 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:03 crc kubenswrapper[4713]: E0308 00:09:03.541922 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:03 crc kubenswrapper[4713]: E0308 00:09:03.542112 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:04 crc kubenswrapper[4713]: I0308 00:09:04.540208 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:04 crc kubenswrapper[4713]: E0308 00:09:04.540439 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:05 crc kubenswrapper[4713]: I0308 00:09:05.540556 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:05 crc kubenswrapper[4713]: I0308 00:09:05.540684 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:05 crc kubenswrapper[4713]: E0308 00:09:05.540740 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:05 crc kubenswrapper[4713]: I0308 00:09:05.540582 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:05 crc kubenswrapper[4713]: E0308 00:09:05.540967 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:05 crc kubenswrapper[4713]: E0308 00:09:05.541069 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:06 crc kubenswrapper[4713]: I0308 00:09:06.540889 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:06 crc kubenswrapper[4713]: E0308 00:09:06.542143 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:06 crc kubenswrapper[4713]: E0308 00:09:06.659928 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:09:07 crc kubenswrapper[4713]: I0308 00:09:07.540591 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:07 crc kubenswrapper[4713]: I0308 00:09:07.540695 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:07 crc kubenswrapper[4713]: I0308 00:09:07.540732 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:07 crc kubenswrapper[4713]: E0308 00:09:07.540804 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:07 crc kubenswrapper[4713]: E0308 00:09:07.541051 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:07 crc kubenswrapper[4713]: E0308 00:09:07.541309 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.370474 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/1.log" Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.371020 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/0.log" Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.371058 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf95e3f7-808b-434f-8fd4-c7e7365a1561" containerID="889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f" exitCode=1 Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.371092 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerDied","Data":"889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f"} Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.371140 4713 scope.go:117] "RemoveContainer" containerID="f5c58b5b388d3e61afef270fcd374b4ca34aca8faaa5d56d4bf1244674af7ea2" Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.371580 4713 scope.go:117] "RemoveContainer" containerID="889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f" Mar 08 00:09:08 crc kubenswrapper[4713]: E0308 00:09:08.371927 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-fh96f_openshift-multus(bf95e3f7-808b-434f-8fd4-c7e7365a1561)\"" pod="openshift-multus/multus-fh96f" podUID="bf95e3f7-808b-434f-8fd4-c7e7365a1561" Mar 08 00:09:08 crc kubenswrapper[4713]: I0308 00:09:08.540171 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:08 crc kubenswrapper[4713]: E0308 00:09:08.540296 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:09 crc kubenswrapper[4713]: I0308 00:09:09.374632 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/1.log" Mar 08 00:09:09 crc kubenswrapper[4713]: I0308 00:09:09.540319 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:09 crc kubenswrapper[4713]: I0308 00:09:09.540450 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:09 crc kubenswrapper[4713]: I0308 00:09:09.540513 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:09 crc kubenswrapper[4713]: E0308 00:09:09.540533 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:09 crc kubenswrapper[4713]: E0308 00:09:09.540579 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:09 crc kubenswrapper[4713]: E0308 00:09:09.540658 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:10 crc kubenswrapper[4713]: I0308 00:09:10.541055 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:10 crc kubenswrapper[4713]: E0308 00:09:10.541195 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:11 crc kubenswrapper[4713]: I0308 00:09:11.540959 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:11 crc kubenswrapper[4713]: I0308 00:09:11.541018 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:11 crc kubenswrapper[4713]: I0308 00:09:11.541090 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:11 crc kubenswrapper[4713]: E0308 00:09:11.541192 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:11 crc kubenswrapper[4713]: E0308 00:09:11.541958 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:11 crc kubenswrapper[4713]: E0308 00:09:11.542118 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:11 crc kubenswrapper[4713]: I0308 00:09:11.542559 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:09:11 crc kubenswrapper[4713]: E0308 00:09:11.542869 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gsfft_openshift-ovn-kubernetes(56fbba07-87e8-4e77-b834-ed68af718d11)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" Mar 08 00:09:11 crc kubenswrapper[4713]: E0308 00:09:11.661170 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:09:12 crc kubenswrapper[4713]: I0308 00:09:12.540804 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:12 crc kubenswrapper[4713]: E0308 00:09:12.541315 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:13 crc kubenswrapper[4713]: I0308 00:09:13.540635 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:13 crc kubenswrapper[4713]: I0308 00:09:13.540677 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:13 crc kubenswrapper[4713]: E0308 00:09:13.540776 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:13 crc kubenswrapper[4713]: I0308 00:09:13.540714 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:13 crc kubenswrapper[4713]: E0308 00:09:13.540887 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:13 crc kubenswrapper[4713]: E0308 00:09:13.540987 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:14 crc kubenswrapper[4713]: I0308 00:09:14.541051 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:14 crc kubenswrapper[4713]: E0308 00:09:14.541169 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:15 crc kubenswrapper[4713]: I0308 00:09:15.588053 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:15 crc kubenswrapper[4713]: I0308 00:09:15.588080 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:15 crc kubenswrapper[4713]: I0308 00:09:15.588142 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:15 crc kubenswrapper[4713]: E0308 00:09:15.588269 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:15 crc kubenswrapper[4713]: E0308 00:09:15.588566 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:15 crc kubenswrapper[4713]: E0308 00:09:15.588755 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:16 crc kubenswrapper[4713]: I0308 00:09:16.540042 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:16 crc kubenswrapper[4713]: E0308 00:09:16.541502 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:17 crc kubenswrapper[4713]: E0308 00:09:17.503816 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:09:17 crc kubenswrapper[4713]: I0308 00:09:17.540250 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:17 crc kubenswrapper[4713]: I0308 00:09:17.540273 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:17 crc kubenswrapper[4713]: E0308 00:09:17.540488 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:17 crc kubenswrapper[4713]: E0308 00:09:17.540767 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:17 crc kubenswrapper[4713]: I0308 00:09:17.540939 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:17 crc kubenswrapper[4713]: E0308 00:09:17.541018 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:18 crc kubenswrapper[4713]: I0308 00:09:18.541060 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:18 crc kubenswrapper[4713]: E0308 00:09:18.541217 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:19 crc kubenswrapper[4713]: I0308 00:09:19.540638 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:19 crc kubenswrapper[4713]: I0308 00:09:19.540638 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:19 crc kubenswrapper[4713]: E0308 00:09:19.540872 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:19 crc kubenswrapper[4713]: I0308 00:09:19.541105 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:19 crc kubenswrapper[4713]: E0308 00:09:19.541114 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:19 crc kubenswrapper[4713]: E0308 00:09:19.541176 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:20 crc kubenswrapper[4713]: I0308 00:09:20.540466 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:20 crc kubenswrapper[4713]: I0308 00:09:20.540897 4713 scope.go:117] "RemoveContainer" containerID="889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f" Mar 08 00:09:20 crc kubenswrapper[4713]: E0308 00:09:20.540921 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:21 crc kubenswrapper[4713]: I0308 00:09:21.519436 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/1.log" Mar 08 00:09:21 crc kubenswrapper[4713]: I0308 00:09:21.519489 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerStarted","Data":"393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222"} Mar 08 00:09:21 crc kubenswrapper[4713]: I0308 00:09:21.540560 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:21 crc kubenswrapper[4713]: I0308 00:09:21.540574 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:21 crc kubenswrapper[4713]: I0308 00:09:21.540680 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:21 crc kubenswrapper[4713]: E0308 00:09:21.540778 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:21 crc kubenswrapper[4713]: E0308 00:09:21.540914 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:21 crc kubenswrapper[4713]: E0308 00:09:21.541057 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:22 crc kubenswrapper[4713]: E0308 00:09:22.505443 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 00:09:22 crc kubenswrapper[4713]: I0308 00:09:22.540573 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:22 crc kubenswrapper[4713]: E0308 00:09:22.540697 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:23 crc kubenswrapper[4713]: I0308 00:09:23.540693 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:23 crc kubenswrapper[4713]: I0308 00:09:23.540756 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:23 crc kubenswrapper[4713]: I0308 00:09:23.540814 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:23 crc kubenswrapper[4713]: E0308 00:09:23.541109 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:23 crc kubenswrapper[4713]: E0308 00:09:23.541316 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:23 crc kubenswrapper[4713]: I0308 00:09:23.541340 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:09:23 crc kubenswrapper[4713]: E0308 00:09:23.541387 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.310050 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9klvz"] Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.536532 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/3.log" Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.539266 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:24 crc kubenswrapper[4713]: E0308 00:09:24.539638 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.539898 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerStarted","Data":"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d"} Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.543339 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.543393 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:24 crc kubenswrapper[4713]: E0308 00:09:24.543459 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:24 crc kubenswrapper[4713]: I0308 00:09:24.566496 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podStartSLOduration=146.566479527 podStartE2EDuration="2m26.566479527s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:24.566017255 +0000 UTC m=+218.685649518" watchObservedRunningTime="2026-03-08 00:09:24.566479527 +0000 UTC m=+218.686111760" Mar 08 00:09:25 crc kubenswrapper[4713]: I0308 00:09:25.540454 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:25 crc kubenswrapper[4713]: I0308 00:09:25.540478 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:25 crc kubenswrapper[4713]: E0308 00:09:25.540615 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 00:09:25 crc kubenswrapper[4713]: E0308 00:09:25.540757 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 00:09:26 crc kubenswrapper[4713]: I0308 00:09:26.541094 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:26 crc kubenswrapper[4713]: E0308 00:09:26.543265 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9klvz" podUID="02de296b-0485-4f21-abf9-51043545b565" Mar 08 00:09:26 crc kubenswrapper[4713]: I0308 00:09:26.543652 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:26 crc kubenswrapper[4713]: E0308 00:09:26.543803 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.471108 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:27 crc kubenswrapper[4713]: E0308 00:09:27.471238 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:11:29.471218072 +0000 UTC m=+343.590850305 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.540085 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.540131 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.542277 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.542439 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.571498 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.571556 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.571581 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.571606 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:27 crc kubenswrapper[4713]: E0308 00:09:27.571676 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:09:27 crc kubenswrapper[4713]: E0308 00:09:27.571721 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:11:29.571708017 +0000 UTC m=+343.691340250 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 00:09:27 crc kubenswrapper[4713]: E0308 00:09:27.572046 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:09:27 crc kubenswrapper[4713]: E0308 00:09:27.572085 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 00:11:29.572073987 +0000 UTC m=+343.691706220 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.577578 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.577677 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.855285 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.863792 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.887883 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.930347 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-58c66"] Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.931065 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.931529 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.931988 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.934664 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.934665 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.935690 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.935811 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.935737 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.936589 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.936982 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.937131 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fhq98"] Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.937713 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.939768 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"] Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.940369 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"] Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.940807 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.941121 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"] Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.941511 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.942179 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.945220 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"] Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.945614 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.991298 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.991540 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.992127 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.992375 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.992383 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.992410 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 00:09:27 crc kubenswrapper[4713]: I0308 00:09:27.992470 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:27.999961 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-gk97q"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.000539 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.008642 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.008921 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.009077 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.009248 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.009663 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.009785 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.010061 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.010190 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.010303 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.010429 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 00:09:28 crc 
kubenswrapper[4713]: I0308 00:09:28.029321 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.029694 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030239 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030425 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030533 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030571 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030687 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030790 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.030929 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.031084 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034127 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034347 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034418 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034596 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034861 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034361 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.041956 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.044710 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.044729 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.044953 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045002 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2k6nd"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 
00:09:28.045071 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045170 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045267 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045365 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.034396 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.042464 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.042578 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045530 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.042695 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.042743 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.043013 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.045691 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.049988 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.053332 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-z4s84"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.053671 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.053744 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.063561 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.065020 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.066458 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.067850 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.071977 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29548800-ghv4d"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.072907 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079617 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079653 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjfj6\" (UniqueName: \"kubernetes.io/projected/c6893b56-2395-4f91-9349-c23b48b957c8-kube-api-access-hjfj6\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079673 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pt9w\" (UniqueName: \"kubernetes.io/projected/10940629-a0dc-4828-a913-20a754f4896b-kube-api-access-7pt9w\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079692 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-audit\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079725 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-audit-policies\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079741 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz4bd\" (UniqueName: \"kubernetes.io/projected/c61cbc0b-441e-4704-accf-35963b3758aa-kube-api-access-tz4bd\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079778 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-audit-dir\") pod \"apiserver-76f77b778f-58c66\" (UID: 
\"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079797 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079910 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6893b56-2395-4f91-9349-c23b48b957c8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.079926 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-549nc\" (UniqueName: \"kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.111183 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.111400 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.111659 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"] Mar 08 00:09:28 crc 
kubenswrapper[4713]: I0308 00:09:28.112231 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.085588 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.112889 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-client\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.112915 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-client\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.112941 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.112966 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-encryption-config\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.112988 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113006 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c61cbc0b-441e-4704-accf-35963b3758aa-audit-dir\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113023 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5ghw\" (UniqueName: \"kubernetes.io/projected/bfa92863-23f8-42d4-8e73-433bf546d304-kube-api-access-q5ghw\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113051 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-node-pullsecrets\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc 
kubenswrapper[4713]: I0308 00:09:28.113072 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113091 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113118 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzcz5\" (UniqueName: \"kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113140 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113180 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113200 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-images\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113223 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113242 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113264 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10940629-a0dc-4828-a913-20a754f4896b-serving-cert\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113282 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-service-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113304 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-serving-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113327 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113346 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdzxf\" (UniqueName: \"kubernetes.io/projected/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-kube-api-access-sdzxf\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113364 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-encryption-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113385 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-trusted-ca-bundle\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113432 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-image-import-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113453 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-serving-cert\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113482 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-config\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113500 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-serving-cert\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113520 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-config\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.113872 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.114907 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.115289 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.115475 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.116807 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.117012 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.117756 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.119588 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.120161 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.120452 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.120754 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.120917 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.121030 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.124072 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.125288 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.125559 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.125988 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.126203 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.126355 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.127651 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.127750 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.127847 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.127961 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.128030 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.129711 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.130646 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.131012 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.131173 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.131335 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.132411 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.133161 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.133330 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-drs4q"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.133909 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.133989 4713 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.134180 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.152881 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xr24g"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.153522 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.153858 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.154090 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.154280 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.154519 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.155813 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.157930 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.157956 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.159476 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.158468 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.158718 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.160412 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.164116 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.164950 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165067 
4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165170 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165267 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165394 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165568 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165681 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.165794 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.166098 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.178715 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.179182 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.184409 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.184744 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.184747 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.187675 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.197449 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.197997 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.200062 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.202296 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.202371 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.202616 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.202923 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.202854 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.203315 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.203733 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2qwgb"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.203984 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.204171 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.204376 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.204627 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.204804 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-shncx"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.205412 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.205580 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.205599 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-58c66"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.205612 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4qpfj"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.205839 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.206586 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.212530 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.214093 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.214469 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.214590 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fhq98"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.214608 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.215024 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.215917 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjfj6\" (UniqueName: \"kubernetes.io/projected/c6893b56-2395-4f91-9349-c23b48b957c8-kube-api-access-hjfj6\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.215946 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.215969 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df45t\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-kube-api-access-df45t\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.215987 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/452f8fcb-d31f-41d4-be85-d041d7efc756-serving-cert\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216003 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pt9w\" (UniqueName: \"kubernetes.io/projected/10940629-a0dc-4828-a913-20a754f4896b-kube-api-access-7pt9w\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216020 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-audit\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216051 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d74nj\" (UniqueName: \"kubernetes.io/projected/62cfca3e-2ad8-4964-bd9a-5f907f09ca1e-kube-api-access-d74nj\") pod \"downloads-7954f5f757-z4s84\" (UID: \"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e\") " pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216068 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfkqd\" (UniqueName: 
\"kubernetes.io/projected/1d068555-56f2-4bcf-8b4c-cc574ad087fa-kube-api-access-nfkqd\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216084 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216106 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-audit-policies\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216134 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.216151 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz4bd\" (UniqueName: \"kubernetes.io/projected/c61cbc0b-441e-4704-accf-35963b3758aa-kube-api-access-tz4bd\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217222 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-audit-dir\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217248 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217265 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69b6d0bc-e512-432d-9a6f-f79318c0f571-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217283 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217305 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfj7m\" (UniqueName: \"kubernetes.io/projected/452f8fcb-d31f-41d4-be85-d041d7efc756-kube-api-access-mfj7m\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217329 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6893b56-2395-4f91-9349-c23b48b957c8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217350 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-549nc\" (UniqueName: \"kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217419 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-trusted-ca\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217447 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtmqw\" (UniqueName: \"kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217468 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfg7d\" 
(UniqueName: \"kubernetes.io/projected/00793875-21cf-4a6e-8da2-2d94bd3725c4-kube-api-access-hfg7d\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217492 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217526 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217546 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-config\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217572 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-client\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217595 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-client\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217615 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217639 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217664 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-encryption-config\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.217753 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-config\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc 
kubenswrapper[4713]: I0308 00:09:28.217876 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-audit-dir\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.218517 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.218793 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-audit-policies\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.218888 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.218911 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-service-ca\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: 
I0308 00:09:28.218957 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.218987 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c61cbc0b-441e-4704-accf-35963b3758aa-audit-dir\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.219002 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5ghw\" (UniqueName: \"kubernetes.io/projected/bfa92863-23f8-42d4-8e73-433bf546d304-kube-api-access-q5ghw\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.219043 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.219623 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.219654 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c61cbc0b-441e-4704-accf-35963b3758aa-audit-dir\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.219888 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.219917 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:28.719905564 +0000 UTC m=+222.839537797 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220024 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-node-pullsecrets\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220030 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220053 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220079 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bfa92863-23f8-42d4-8e73-433bf546d304-node-pullsecrets\") pod \"apiserver-76f77b778f-58c66\" 
(UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220129 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220258 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-audit\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220298 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220377 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/452f8fcb-d31f-41d4-be85-d041d7efc756-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220488 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220539 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzcz5\" (UniqueName: \"kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220574 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.220647 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.221137 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.221170 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.221208 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-images\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.221505 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222107 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222105 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-images\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222185 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222307 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10940629-a0dc-4828-a913-20a754f4896b-serving-cert\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222325 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-service-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.222521 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69b6d0bc-e512-432d-9a6f-f79318c0f571-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.223139 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c61cbc0b-441e-4704-accf-35963b3758aa-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.223204 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-serving-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.223938 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-serving-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.223981 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.224028 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-oauth-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.224096 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sdzxf\" (UniqueName: \"kubernetes.io/projected/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-kube-api-access-sdzxf\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.225338 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-service-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226188 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-encryption-config\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226258 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-etcd-client\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226294 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-encryption-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226318 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226321 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-etcd-client\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226370 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-oauth-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226412 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226433 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-trusted-ca-bundle\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: 
I0308 00:09:28.226516 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-image-import-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226835 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226887 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-serving-cert\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226932 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk5fw\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.226981 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-trusted-ca-bundle\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " 
pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227006 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-config\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227022 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-serving-cert\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227037 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-config\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227061 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227098 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00793875-21cf-4a6e-8da2-2d94bd3725c4-serving-cert\") pod 
\"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227611 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-trusted-ca-bundle\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.227819 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.228689 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bfa92863-23f8-42d4-8e73-433bf546d304-image-import-ca\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.228771 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-config\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.228855 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c6893b56-2395-4f91-9349-c23b48b957c8-config\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.229042 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.229095 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10940629-a0dc-4828-a913-20a754f4896b-serving-cert\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.229476 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.230019 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10940629-a0dc-4828-a913-20a754f4896b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.231179 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4xznw\" (UID: 
\"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.232431 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.233247 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-serving-cert\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.232762 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c61cbc0b-441e-4704-accf-35963b3758aa-serving-cert\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.232868 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.232560 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bfa92863-23f8-42d4-8e73-433bf546d304-encryption-config\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.233691 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.233949 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.234588 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.234891 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.235930 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.236562 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.237999 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548808-nd57l"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.238481 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548808-nd57l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.240011 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.241876 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.242310 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6893b56-2395-4f91-9349-c23b48b957c8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.242886 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.245495 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.246735 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.249226 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.252073 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.255450 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.260451 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-c4nq5"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.262369 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.268109 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.270476 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5bltg"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.272455 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.274741 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.277583 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29548800-ghv4d"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.280060 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.282538 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2k6nd"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.289131 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.292074 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xr24g"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.292370 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.294183 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-z4s84"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.294288 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.295027 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"] Mar 08 00:09:28 crc 
kubenswrapper[4713]: I0308 00:09:28.296209 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.297748 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lwhnh"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.298618 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.298910 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xmjhj"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.299604 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.299762 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-sxbdk"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.301126 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.301343 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4qpfj"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.302638 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.303865 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.305162 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.306341 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.307493 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.308674 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.309907 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gk97q"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.310995 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lwhnh"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.312077 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-shncx"] Mar 08 00:09:28 crc 
kubenswrapper[4713]: I0308 00:09:28.312914 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.313177 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.314408 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.315747 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.316956 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.318234 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.319555 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.320685 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2qwgb"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.323637 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.323690 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xmjhj"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.326236 4713 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548808-nd57l"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328137 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328291 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-srv-cert\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.328327 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:28.828302728 +0000 UTC m=+222.947934961 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328383 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328427 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmrdb\" (UniqueName: \"kubernetes.io/projected/2be1cb07-55b6-4220-989e-13415c3156b2-kube-api-access-kmrdb\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328495 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69b6d0bc-e512-432d-9a6f-f79318c0f571-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328533 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328567 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-stats-auth\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328597 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-oauth-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328619 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm5dw\" (UniqueName: \"kubernetes.io/projected/5eb834dd-5358-45c4-bbca-50baf0e8656b-kube-api-access-wm5dw\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328654 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-default-certificate\") pod \"router-default-5444994796-drs4q\" (UID: 
\"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328682 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/141fc694-b9ce-4b84-9e39-0e79a487e398-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328709 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328730 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/548e19ee-14eb-4075-b9e3-69178800837c-service-ca-bundle\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328753 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328795 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328846 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00793875-21cf-4a6e-8da2-2d94bd3725c4-serving-cert\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328872 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328894 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/452f8fcb-d31f-41d4-be85-d041d7efc756-serving-cert\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328920 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6qkt\" (UniqueName: \"kubernetes.io/projected/fd936d68-81ed-4923-8078-5ad0116d532e-kube-api-access-j6qkt\") pod \"migrator-59844c95c7-wld5v\" (UID: \"fd936d68-81ed-4923-8078-5ad0116d532e\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328956 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.328984 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329010 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dspc4\" (UniqueName: \"kubernetes.io/projected/3a74e1e8-3928-4220-b55d-ee42585ef1ee-kube-api-access-dspc4\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329047 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69b6d0bc-e512-432d-9a6f-f79318c0f571-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329075 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mfj7m\" (UniqueName: \"kubernetes.io/projected/452f8fcb-d31f-41d4-be85-d041d7efc756-kube-api-access-mfj7m\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329101 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329139 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtmqw\" (UniqueName: \"kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329163 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c9f8ace1-247f-4128-b3f7-95037fb1a156-machine-approver-tls\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329188 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329225 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfg7d\" (UniqueName: \"kubernetes.io/projected/00793875-21cf-4a6e-8da2-2d94bd3725c4-kube-api-access-hfg7d\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329249 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329273 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329299 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p77q9\" (UniqueName: \"kubernetes.io/projected/6e21b584-0781-4fa9-8811-332d42755c17-kube-api-access-p77q9\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329322 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-auth-proxy-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329353 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a74e1e8-3928-4220-b55d-ee42585ef1ee-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329376 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8lcv\" (UniqueName: \"kubernetes.io/projected/c9f8ace1-247f-4128-b3f7-95037fb1a156-kube-api-access-w8lcv\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329399 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329422 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxfnr\" (UniqueName: 
\"kubernetes.io/projected/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-kube-api-access-qxfnr\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329451 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329454 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-oauth-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329505 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329523 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-service-ca\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329550 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329580 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329597 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-service-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329615 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g45bj\" (UniqueName: \"kubernetes.io/projected/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-kube-api-access-g45bj\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329636 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329656 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-795x2\" (UniqueName: \"kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329688 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/452f8fcb-d31f-41d4-be85-d041d7efc756-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329706 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccf0e825-0465-40ae-b0ca-f4f7c377e518-metrics-tls\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329724 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6wlf\" (UniqueName: \"kubernetes.io/projected/141fc694-b9ce-4b84-9e39-0e79a487e398-kube-api-access-j6wlf\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329749 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tch6h\" (UniqueName: \"kubernetes.io/projected/496a4fbf-c338-4b64-96a5-dda456094c28-kube-api-access-tch6h\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329768 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329789 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp6ps\" (UniqueName: \"kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329844 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-profile-collector-cert\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329864 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k27jn\" (UniqueName: 
\"kubernetes.io/projected/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-kube-api-access-k27jn\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329882 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-metrics-certs\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329903 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329923 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329939 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-oauth-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329956 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dbf7b38-8980-49e5-956c-08e443912846-config\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329975 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.329994 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d587l\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-kube-api-access-d587l\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330012 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk5fw\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330066 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: 
\"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330088 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-trusted-ca-bundle\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330110 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330164 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e21b584-0781-4fa9-8811-332d42755c17-proxy-tls\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330196 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-config\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330234 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dbf7b38-8980-49e5-956c-08e443912846-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330531 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df45t\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-kube-api-access-df45t\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330560 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330584 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330645 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d74nj\" (UniqueName: \"kubernetes.io/projected/62cfca3e-2ad8-4964-bd9a-5f907f09ca1e-kube-api-access-d74nj\") pod 
\"downloads-7954f5f757-z4s84\" (UID: \"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e\") " pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330672 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfkqd\" (UniqueName: \"kubernetes.io/projected/1d068555-56f2-4bcf-8b4c-cc574ad087fa-kube-api-access-nfkqd\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330702 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330726 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e21b584-0781-4fa9-8811-332d42755c17-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330319 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69b6d0bc-e512-432d-9a6f-f79318c0f571-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330798 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330855 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5bltg"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.330879 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q84x9"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331175 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331212 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331249 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-trusted-ca\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 
crc kubenswrapper[4713]: I0308 00:09:28.331272 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331303 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-serving-cert\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331326 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/141fc694-b9ce-4b84-9e39-0e79a487e398-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331395 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-config\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331393 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.331464 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332001 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/496a4fbf-c338-4b64-96a5-dda456094c28-proxy-tls\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332050 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmkds\" (UniqueName: \"kubernetes.io/projected/548e19ee-14eb-4075-b9e3-69178800837c-kube-api-access-wmkds\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332138 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 
crc kubenswrapper[4713]: E0308 00:09:28.332170 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:28.832138434 +0000 UTC m=+222.951770737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332207 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nk4f\" (UniqueName: \"kubernetes.io/projected/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-kube-api-access-7nk4f\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332225 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332271 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1cb07-55b6-4220-989e-13415c3156b2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332358 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3811a82-b0fe-4e06-948a-79cbbc840a98-metrics-tls\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332466 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrkff\" (UniqueName: \"kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff\") pod \"auto-csr-approver-29548808-nd57l\" (UID: \"fdccd72c-79d7-4388-926e-0539c571dafe\") " pod="openshift-infra/auto-csr-approver-29548808-nd57l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332496 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332599 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dbf7b38-8980-49e5-956c-08e443912846-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332642 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332673 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-config\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332696 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3811a82-b0fe-4e06-948a-79cbbc840a98-trusted-ca\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332759 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: 
\"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332811 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332875 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rckjk\" (UniqueName: \"kubernetes.io/projected/ccf0e825-0465-40ae-b0ca-f4f7c377e518-kube-api-access-rckjk\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332896 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332921 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.332973 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-images\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333051 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9tmn\" (UniqueName: \"kubernetes.io/projected/0d2f415a-2626-45f9-baf0-68ab25b9d079-kube-api-access-l9tmn\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333104 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333140 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-client\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333192 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: 
\"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333267 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be1cb07-55b6-4220-989e-13415c3156b2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333489 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-c4nq5"] Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.333910 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.334104 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/452f8fcb-d31f-41d4-be85-d041d7efc756-serving-cert\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.334322 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.334704 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-service-ca\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.335214 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.335534 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00793875-21cf-4a6e-8da2-2d94bd3725c4-serving-cert\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.335774 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.335986 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-serving-cert\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.336136 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-trusted-ca\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.336498 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69b6d0bc-e512-432d-9a6f-f79318c0f571-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.336694 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-config\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.336834 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.336895 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/452f8fcb-d31f-41d4-be85-d041d7efc756-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 
00:09:28.336905 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q84x9"]
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.337127 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d068555-56f2-4bcf-8b4c-cc574ad087fa-trusted-ca-bundle\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.337442 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.337543 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00793875-21cf-4a6e-8da2-2d94bd3725c4-config\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.338920 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d068555-56f2-4bcf-8b4c-cc574ad087fa-console-oauth-config\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.339545 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.353074 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.373817 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.393927 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.413597 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.433523 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.434404 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.434595 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rckjk\" (UniqueName: \"kubernetes.io/projected/ccf0e825-0465-40ae-b0ca-f4f7c377e518-kube-api-access-rckjk\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.434702 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.434761 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-images\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.434788 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9tmn\" (UniqueName: \"kubernetes.io/projected/0d2f415a-2626-45f9-baf0-68ab25b9d079-kube-api-access-l9tmn\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"
Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.435166 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:28.935119812 +0000 UTC m=+223.054752045 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435208 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-client\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435250 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be1cb07-55b6-4220-989e-13415c3156b2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435315 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-mountpoint-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435340 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-srv-cert\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435372 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435420 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmrdb\" (UniqueName: \"kubernetes.io/projected/2be1cb07-55b6-4220-989e-13415c3156b2-kube-api-access-kmrdb\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435443 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435469 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-stats-auth\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435490 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm5dw\" (UniqueName: \"kubernetes.io/projected/5eb834dd-5358-45c4-bbca-50baf0e8656b-kube-api-access-wm5dw\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435549 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-socket-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435574 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-default-certificate\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435593 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/141fc694-b9ce-4b84-9e39-0e79a487e398-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435608 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-registration-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435624 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435696 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/548e19ee-14eb-4075-b9e3-69178800837c-service-ca-bundle\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435763 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435790 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435811 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6qkt\" (UniqueName: \"kubernetes.io/projected/fd936d68-81ed-4923-8078-5ad0116d532e-kube-api-access-j6qkt\") pod \"migrator-59844c95c7-wld5v\" (UID: \"fd936d68-81ed-4923-8078-5ad0116d532e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435884 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435904 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dspc4\" (UniqueName: \"kubernetes.io/projected/3a74e1e8-3928-4220-b55d-ee42585ef1ee-kube-api-access-dspc4\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.435951 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436063 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c9f8ace1-247f-4128-b3f7-95037fb1a156-machine-approver-tls\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436081 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436128 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436174 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p77q9\" (UniqueName: \"kubernetes.io/projected/6e21b584-0781-4fa9-8811-332d42755c17-kube-api-access-p77q9\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436202 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-auth-proxy-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436312 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a74e1e8-3928-4220-b55d-ee42585ef1ee-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436361 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8lcv\" (UniqueName: \"kubernetes.io/projected/c9f8ace1-247f-4128-b3f7-95037fb1a156-kube-api-access-w8lcv\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436383 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.436427 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxfnr\" (UniqueName: \"kubernetes.io/projected/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-kube-api-access-qxfnr\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.438146 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439065 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439090 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439135 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-auth-proxy-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439169 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439222 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439332 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439424 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-service-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439450 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g45bj\" (UniqueName: \"kubernetes.io/projected/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-kube-api-access-g45bj\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439738 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/548e19ee-14eb-4075-b9e3-69178800837c-service-ca-bundle\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.439816 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:28.93979933 +0000 UTC m=+223.059431563 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.439984 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440429 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-795x2\" (UniqueName: \"kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440464 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440501 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccf0e825-0465-40ae-b0ca-f4f7c377e518-metrics-tls\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440542 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6wlf\" (UniqueName: \"kubernetes.io/projected/141fc694-b9ce-4b84-9e39-0e79a487e398-kube-api-access-j6wlf\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.440589 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.442537 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c9f8ace1-247f-4128-b3f7-95037fb1a156-machine-approver-tls\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.445345 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-stats-auth\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447071 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447083 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tch6h\" (UniqueName: \"kubernetes.io/projected/496a4fbf-c338-4b64-96a5-dda456094c28-kube-api-access-tch6h\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447151 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447196 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsn7h\" (UniqueName: \"kubernetes.io/projected/a8c7be2b-608c-4089-b8a6-76bef69c3588-kube-api-access-bsn7h\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447231 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp6ps\" (UniqueName: \"kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447262 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447311 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-profile-collector-cert\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447336 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be1cb07-55b6-4220-989e-13415c3156b2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447484 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8vz2\" (UniqueName: \"kubernetes.io/projected/063a79dd-fbe8-4562-98bc-deb309b25182-kube-api-access-m8vz2\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447541 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-default-certificate\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447563 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-metrics-certs\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447612 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447633 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k27jn\" (UniqueName: \"kubernetes.io/projected/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-kube-api-access-k27jn\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447675 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447696 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447719 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-plugins-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447740 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmkpx\" (UniqueName: \"kubernetes.io/projected/39da2ba4-aebb-485b-8e46-7ffc36efa490-kube-api-access-mmkpx\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447759 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447781 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dbf7b38-8980-49e5-956c-08e443912846-config\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447802 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpf9l\" (UniqueName: \"kubernetes.io/projected/ee63f184-4609-43d4-bdc1-2c840aef6d7f-kube-api-access-rpf9l\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447862 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d587l\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-kube-api-access-d587l\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447886 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-csi-data-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447911 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.447938 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzk6\" (UniqueName: \"kubernetes.io/projected/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-kube-api-access-lvzk6\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448020 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448040 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-tmpfs\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448061 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e21b584-0781-4fa9-8811-332d42755c17-proxy-tls\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448083 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-config\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448107 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dbf7b38-8980-49e5-956c-08e443912846-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448111 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448127 4713 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-268pq\" (UniqueName: \"kubernetes.io/projected/899ec382-6c79-460e-9e3c-9dfb25867855-kube-api-access-268pq\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448149 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448190 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448224 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448262 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e21b584-0781-4fa9-8811-332d42755c17-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-shncx\" (UID: 
\"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448306 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448332 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448354 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448375 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448400 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-serving-cert\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448442 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/141fc694-b9ce-4b84-9e39-0e79a487e398-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448482 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/496a4fbf-c338-4b64-96a5-dda456094c28-proxy-tls\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448501 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmkds\" (UniqueName: \"kubernetes.io/projected/548e19ee-14eb-4075-b9e3-69178800837c-kube-api-access-wmkds\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " 
pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448520 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448538 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x9m6\" (UniqueName: \"kubernetes.io/projected/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-kube-api-access-4x9m6\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448558 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nk4f\" (UniqueName: \"kubernetes.io/projected/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-kube-api-access-7nk4f\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448580 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1cb07-55b6-4220-989e-13415c3156b2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448601 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3811a82-b0fe-4e06-948a-79cbbc840a98-metrics-tls\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448598 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f8ace1-247f-4128-b3f7-95037fb1a156-config\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448626 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448648 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrkff\" (UniqueName: \"kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff\") pod \"auto-csr-approver-29548808-nd57l\" (UID: \"fdccd72c-79d7-4388-926e-0539c571dafe\") " pod="openshift-infra/auto-csr-approver-29548808-nd57l" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448669 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dbf7b38-8980-49e5-956c-08e443912846-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448690 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448708 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3811a82-b0fe-4e06-948a-79cbbc840a98-trusted-ca\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448728 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.448800 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.450033 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.450040 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.450242 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1cb07-55b6-4220-989e-13415c3156b2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.450627 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.451156 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3811a82-b0fe-4e06-948a-79cbbc840a98-trusted-ca\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.451368 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e21b584-0781-4fa9-8811-332d42755c17-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.453161 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.453285 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.453426 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3811a82-b0fe-4e06-948a-79cbbc840a98-metrics-tls\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.453503 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/548e19ee-14eb-4075-b9e3-69178800837c-metrics-certs\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.454244 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.454652 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.456046 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.459276 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.474169 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.494047 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 
00:09:28.514055 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.533116 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.540048 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.540272 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.542528 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dbf7b38-8980-49e5-956c-08e443912846-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.551295 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.551790 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:29.051447485 +0000 UTC m=+223.171079728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552100 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsn7h\" (UniqueName: \"kubernetes.io/projected/a8c7be2b-608c-4089-b8a6-76bef69c3588-kube-api-access-bsn7h\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552284 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552313 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8vz2\" (UniqueName: \"kubernetes.io/projected/063a79dd-fbe8-4562-98bc-deb309b25182-kube-api-access-m8vz2\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552398 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552421 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-plugins-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552522 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmkpx\" (UniqueName: \"kubernetes.io/projected/39da2ba4-aebb-485b-8e46-7ffc36efa490-kube-api-access-mmkpx\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552555 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552581 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpf9l\" (UniqueName: \"kubernetes.io/projected/ee63f184-4609-43d4-bdc1-2c840aef6d7f-kube-api-access-rpf9l\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552612 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-csi-data-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552755 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-plugins-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552794 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzk6\" (UniqueName: \"kubernetes.io/projected/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-kube-api-access-lvzk6\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552799 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-csi-data-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552859 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-tmpfs\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552913 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-268pq\" (UniqueName: 
\"kubernetes.io/projected/899ec382-6c79-460e-9e3c-9dfb25867855-kube-api-access-268pq\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.552973 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553063 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553193 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553216 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553261 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x9m6\" (UniqueName: \"kubernetes.io/projected/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-kube-api-access-4x9m6\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: 
\"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553300 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553310 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-tmpfs\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553358 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-mountpoint-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553393 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-socket-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553421 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-registration-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: 
\"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553437 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553437 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-mountpoint-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553477 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-socket-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553498 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/063a79dd-fbe8-4562-98bc-deb309b25182-registration-dir\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553574 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " 
pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553674 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553702 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.553725 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.554147 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.054137363 +0000 UTC m=+223.173769596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.556576 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e5b834fc84e3d300046cd1fdbffb156a0e873fcbfbfe0a7c813e27e35445753c"} Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.556680 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"435cc4d28c45eb6127b40eadb5213f6a7bded3488d572996cdfa93f02b79b622"} Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.556920 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.559199 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"644fcf93fd59fdbb47c6c87645c6873caee77e45d0017c72c213bddca9a014ef"} Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.559250 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"433a99e1791d6106056165c414a7ac15d22ecdfc4eef2654050d166efe18a4ff"} Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.559408 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dbf7b38-8980-49e5-956c-08e443912846-config\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.573281 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.584155 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccf0e825-0465-40ae-b0ca-f4f7c377e518-metrics-tls\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.593240 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.613424 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.633511 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.653965 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 08 00:09:28 crc kubenswrapper[4713]: 
I0308 00:09:28.654612 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.654855 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.154813792 +0000 UTC m=+223.274446015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.655289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.655844 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:29.155813977 +0000 UTC m=+223.275446210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.673372 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.684410 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/141fc694-b9ce-4b84-9e39-0e79a487e398-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.693898 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.700685 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/141fc694-b9ce-4b84-9e39-0e79a487e398-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.713579 4713 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.753385 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.756201 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.756333 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.256316023 +0000 UTC m=+223.375948266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.756509 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.756841 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.256817645 +0000 UTC m=+223.376449878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.773597 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.792910 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.801814 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.812756 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.819723 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:28 crc 
kubenswrapper[4713]: I0308 00:09:28.832916 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.853610 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.858134 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.858230 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.358211373 +0000 UTC m=+223.477843606 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.858377 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.858698 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.358688855 +0000 UTC m=+223.478321088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.872806 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.885487 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a74e1e8-3928-4220-b55d-ee42585ef1ee-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.892466 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.912814 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.916601 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/496a4fbf-c338-4b64-96a5-dda456094c28-images\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.933461 4713 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.953289 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.960335 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.960404 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.46038662 +0000 UTC m=+223.580018853 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.960958 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:28 crc kubenswrapper[4713]: E0308 00:09:28.961303 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.461292313 +0000 UTC m=+223.580924546 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.962267 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/496a4fbf-c338-4b64-96a5-dda456094c28-proxy-tls\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.973652 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.983602 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:28 crc kubenswrapper[4713]: I0308 00:09:28.994026 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.012801 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 08 00:09:29 
crc kubenswrapper[4713]: I0308 00:09:29.033397 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.053706 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.061749 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.061963 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.561939132 +0000 UTC m=+223.681571375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.062057 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.062502 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.562484086 +0000 UTC m=+223.682116319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.073454 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.083988 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.093462 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.113264 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.132961 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.139475 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-srv-cert\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.152941 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.163532 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.163633 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.663609147 +0000 UTC m=+223.783241370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.163731 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.164057 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.664049218 +0000 UTC m=+223.783681451 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.172999 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.181462 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5eb834dd-5358-45c4-bbca-50baf0e8656b-profile-collector-cert\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.183882 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.192867 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.211903 4713 request.go:700] Waited for 1.005869394s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmcc-proxy-tls&limit=500&resourceVersion=0 Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.213750 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.221533 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e21b584-0781-4fa9-8811-332d42755c17-proxy-tls\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.233175 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.252813 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.262138 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-service-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.265251 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc 
kubenswrapper[4713]: E0308 00:09:29.265401 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.765368034 +0000 UTC m=+223.885000277 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.265852 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.266384 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.766357848 +0000 UTC m=+223.885990091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.273284 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.278092 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-client\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.294359 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.313585 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.322910 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-serving-cert\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.333214 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: 
I0308 00:09:29.352951 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.366563 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.366693 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.866677349 +0000 UTC m=+223.986309582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.366890 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.367227 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.867219623 +0000 UTC m=+223.986851856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.373675 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.380162 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-config\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.393789 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.401448 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-etcd-ca\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.413167 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.433775 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.440140 4713 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.440254 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert podName:d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.940220727 +0000 UTC m=+224.059852990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-h5mxt" (UID: "d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.447961 4713 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.448164 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca podName:9e570b68-8b4c-42e3-839d-f37943999246 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.948136146 +0000 UTC m=+224.067768409 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca") pod "marketplace-operator-79b997595-p9hqz" (UID: "9e570b68-8b4c-42e3-839d-f37943999246") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.450656 4713 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.450689 4713 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.452068 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert podName:0d2f415a-2626-45f9-baf0-68ab25b9d079 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.950752202 +0000 UTC m=+224.070384475 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert") pod "olm-operator-6b444d44fb-8m94r" (UID: "0d2f415a-2626-45f9-baf0-68ab25b9d079") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.452123 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics podName:9e570b68-8b4c-42e3-839d-f37943999246 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.952104366 +0000 UTC m=+224.071736639 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics") pod "marketplace-operator-79b997595-p9hqz" (UID: "9e570b68-8b4c-42e3-839d-f37943999246") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.454015 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.468299 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.469734 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:29.969703098 +0000 UTC m=+224.089335371 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.484396 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.493908 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.534434 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjfj6\" (UniqueName: \"kubernetes.io/projected/c6893b56-2395-4f91-9349-c23b48b957c8-kube-api-access-hjfj6\") pod \"machine-api-operator-5694c8668f-dkkh7\" (UID: \"c6893b56-2395-4f91-9349-c23b48b957c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.547646 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pt9w\" (UniqueName: \"kubernetes.io/projected/10940629-a0dc-4828-a913-20a754f4896b-kube-api-access-7pt9w\") pod \"authentication-operator-69f744f599-fhq98\" (UID: \"10940629-a0dc-4828-a913-20a754f4896b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552503 4713 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552539 4713 secret.go:188] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552601 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert podName:899ec382-6c79-460e-9e3c-9dfb25867855 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.0525824 +0000 UTC m=+224.172214633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert") pod "service-ca-operator-777779d784-5bltg" (UID: "899ec382-6c79-460e-9e3c-9dfb25867855") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552716 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert podName:158ba4b3-9da3-4a83-95dd-e625c7b19a2b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.052610791 +0000 UTC m=+224.172243024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert") pod "ingress-canary-xmjhj" (UID: "158ba4b3-9da3-4a83-95dd-e625c7b19a2b") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552793 4713 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.552838 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config podName:899ec382-6c79-460e-9e3c-9dfb25867855 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:30.052814786 +0000 UTC m=+224.172447019 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config") pod "service-ca-operator-777779d784-5bltg" (UID: "899ec382-6c79-460e-9e3c-9dfb25867855") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554162 4713 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554288 4713 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554321 4713 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554338 4713 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554348 4713 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554388 4713 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554292 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token podName:a8c7be2b-608c-4089-b8a6-76bef69c3588 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:30.054251232 +0000 UTC m=+224.173883505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token") pod "machine-config-server-sxbdk" (UID: "a8c7be2b-608c-4089-b8a6-76bef69c3588") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554279 4713 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554416 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs podName:a8c7be2b-608c-4089-b8a6-76bef69c3588 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054405046 +0000 UTC m=+224.174037279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs") pod "machine-config-server-sxbdk" (UID: "a8c7be2b-608c-4089-b8a6-76bef69c3588") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554442 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert podName:3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054424887 +0000 UTC m=+224.174057120 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert") pod "packageserver-d55dfcdfc-g99pk" (UID: "3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554463 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle podName:ee63f184-4609-43d4-bdc1-2c840aef6d7f nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054453907 +0000 UTC m=+224.174086240 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle") pod "service-ca-9c57cc56f-c4nq5" (UID: "ee63f184-4609-43d4-bdc1-2c840aef6d7f") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554481 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key podName:ee63f184-4609-43d4-bdc1-2c840aef6d7f nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054473078 +0000 UTC m=+224.174105411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key") pod "service-ca-9c57cc56f-c4nq5" (UID: "ee63f184-4609-43d4-bdc1-2c840aef6d7f") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554495 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert podName:3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:30.054487998 +0000 UTC m=+224.174120341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert") pod "packageserver-d55dfcdfc-g99pk" (UID: "3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554326 4713 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554505 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls podName:39da2ba4-aebb-485b-8e46-7ffc36efa490 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054501169 +0000 UTC m=+224.174133402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls") pod "dns-default-lwhnh" (UID: "39da2ba4-aebb-485b-8e46-7ffc36efa490") : failed to sync secret cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.554527 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume podName:39da2ba4-aebb-485b-8e46-7ffc36efa490 nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.054518399 +0000 UTC m=+224.174150742 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume") pod "dns-default-lwhnh" (UID: "39da2ba4-aebb-485b-8e46-7ffc36efa490") : failed to sync configmap cache: timed out waiting for the condition Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.554643 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.568573 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz4bd\" (UniqueName: \"kubernetes.io/projected/c61cbc0b-441e-4704-accf-35963b3758aa-kube-api-access-tz4bd\") pod \"apiserver-7bbb656c7d-l464l\" (UID: \"c61cbc0b-441e-4704-accf-35963b3758aa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.570459 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.570884 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.07087006 +0000 UTC m=+224.190502293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.588102 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-549nc\" (UniqueName: \"kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc\") pod \"controller-manager-879f6c89f-4xznw\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.613542 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5ghw\" (UniqueName: \"kubernetes.io/projected/bfa92863-23f8-42d4-8e73-433bf546d304-kube-api-access-q5ghw\") pod \"apiserver-76f77b778f-58c66\" (UID: \"bfa92863-23f8-42d4-8e73-433bf546d304\") " pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.622265 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.639615 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzcz5\" (UniqueName: \"kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5\") pod \"route-controller-manager-6576b87f9c-7snq7\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.648531 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdzxf\" (UniqueName: \"kubernetes.io/projected/8e76411a-c4c2-4822-9ec9-a7e73c15f7ec-kube-api-access-sdzxf\") pod \"openshift-apiserver-operator-796bbdcf4f-lg6jl\" (UID: \"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.653243 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.655650 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.671787 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.671978 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.17194864 +0000 UTC m=+224.291580883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.672295 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.672692 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.172680238 +0000 UTC m=+224.292312471 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.673774 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.693999 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.713617 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.732775 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.746057 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fhq98"] Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.753094 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 00:09:29 crc kubenswrapper[4713]: W0308 00:09:29.763775 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10940629_a0dc_4828_a913_20a754f4896b.slice/crio-2d50ddd00ecb585fac16ea196ec00bce2d2c4db3abf5dd9994fc43c3faed8cad WatchSource:0}: Error finding container 2d50ddd00ecb585fac16ea196ec00bce2d2c4db3abf5dd9994fc43c3faed8cad: Status 404 returned error can't find the container with id 2d50ddd00ecb585fac16ea196ec00bce2d2c4db3abf5dd9994fc43c3faed8cad Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.767203 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.773435 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.773795 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.273755658 +0000 UTC m=+224.393387891 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.773884 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.773930 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.774441 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.274428465 +0000 UTC m=+224.394060698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.793289 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.813448 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.820337 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dkkh7"] Mar 08 00:09:29 crc kubenswrapper[4713]: W0308 00:09:29.824873 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6893b56_2395_4f91_9349_c23b48b957c8.slice/crio-29a7c4c18a7333fd6b9259f4ff1a952ca8c0aef11eb27d81e32d45184ecd9ba3 WatchSource:0}: Error finding container 29a7c4c18a7333fd6b9259f4ff1a952ca8c0aef11eb27d81e32d45184ecd9ba3: Status 404 returned error can't find the container with id 29a7c4c18a7333fd6b9259f4ff1a952ca8c0aef11eb27d81e32d45184ecd9ba3 Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.825960 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.833980 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.847313 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l"] Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.855646 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.874740 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.875295 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.375276679 +0000 UTC m=+224.494908912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.876943 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.894237 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.898217 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.915134 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.934852 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.935059 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.953083 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.953782 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-58c66"] Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.973564 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.976455 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.976698 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.977774 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.977862 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.977917 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:29 crc kubenswrapper[4713]: E0308 00:09:29.978249 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.478233656 +0000 UTC m=+224.597865889 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.979180 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.986568 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0d2f415a-2626-45f9-baf0-68ab25b9d079-srv-cert\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.987526 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.993140 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 08 00:09:29 crc kubenswrapper[4713]: I0308 00:09:29.994248 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.014898 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.021418 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.033606 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 00:09:30 crc kubenswrapper[4713]: W0308 00:09:30.038675 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ba1fb6_83e1_4a29_93a5_5abf00f86718.slice/crio-a48c3b313279a8d19f79d36e4fdb5a5265b310ba5fe079364f758a6f08817617 WatchSource:0}: Error finding container a48c3b313279a8d19f79d36e4fdb5a5265b310ba5fe079364f758a6f08817617: Status 404 returned error can't find the container with id a48c3b313279a8d19f79d36e4fdb5a5265b310ba5fe079364f758a6f08817617 Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.054480 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.074704 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.078946 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079212 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079350 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079406 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079478 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079524 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079575 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079606 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079657 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.079784 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.579747936 +0000 UTC m=+224.699380209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079891 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079918 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.079992 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.080457 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da2ba4-aebb-485b-8e46-7ffc36efa490-config-volume\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:30 crc kubenswrapper[4713]: 
I0308 00:09:30.080901 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.080982 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-cabundle\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.081171 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899ec382-6c79-460e-9e3c-9dfb25867855-config\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.082930 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-webhook-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.083219 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee63f184-4609-43d4-bdc1-2c840aef6d7f-signing-key\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.093801 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39da2ba4-aebb-485b-8e46-7ffc36efa490-metrics-tls\") pod \"dns-default-lwhnh\" 
(UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.093896 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.093900 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-apiservice-cert\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.095364 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-cert\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.109417 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/899ec382-6c79-460e-9e3c-9dfb25867855-serving-cert\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.114066 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.133923 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.145265 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-certs\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.154981 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.168098 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.174923 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.184120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.184487 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.684474398 +0000 UTC m=+224.804106631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.188259 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8c7be2b-608c-4089-b8a6-76bef69c3588-node-bootstrap-token\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.207164 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfj7m\" (UniqueName: \"kubernetes.io/projected/452f8fcb-d31f-41d4-be85-d041d7efc756-kube-api-access-mfj7m\") pod \"openshift-config-operator-7777fb866f-k5mg9\" (UID: \"452f8fcb-d31f-41d4-be85-d041d7efc756\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.232452 4713 request.go:700] Waited for 1.902487573s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/serviceaccounts/console-operator/token Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.234866 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtmqw\" (UniqueName: \"kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw\") pod \"image-pruner-29548800-ghv4d\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " 
pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.247317 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfg7d\" (UniqueName: \"kubernetes.io/projected/00793875-21cf-4a6e-8da2-2d94bd3725c4-kube-api-access-hfg7d\") pod \"console-operator-58897d9998-2k6nd\" (UID: \"00793875-21cf-4a6e-8da2-2d94bd3725c4\") " pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.271709 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d74nj\" (UniqueName: \"kubernetes.io/projected/62cfca3e-2ad8-4964-bd9a-5f907f09ca1e-kube-api-access-d74nj\") pod \"downloads-7954f5f757-z4s84\" (UID: \"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e\") " pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.286352 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.286681 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.786660144 +0000 UTC m=+224.906292377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.286750 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.287132 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.787115296 +0000 UTC m=+224.906747529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.288407 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfkqd\" (UniqueName: \"kubernetes.io/projected/1d068555-56f2-4bcf-8b4c-cc574ad087fa-kube-api-access-nfkqd\") pod \"console-f9d7485db-gk97q\" (UID: \"1d068555-56f2-4bcf-8b4c-cc574ad087fa\") " pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.299445 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.310804 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.313507 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df45t\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-kube-api-access-df45t\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.319407 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.329386 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69b6d0bc-e512-432d-9a6f-f79318c0f571-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4cd9v\" (UID: \"69b6d0bc-e512-432d-9a6f-f79318c0f571\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.340335 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.349301 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk5fw\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.360030 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.369369 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.373674 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.392317 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.392841 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.892805461 +0000 UTC m=+225.012437694 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.393545 4713 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.416689 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.452251 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f9a6567-ebe5-4ba9-80ab-a2cd48818942-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mmgvw\" (UID: \"8f9a6567-ebe5-4ba9-80ab-a2cd48818942\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.490436 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rckjk\" (UniqueName: \"kubernetes.io/projected/ccf0e825-0465-40ae-b0ca-f4f7c377e518-kube-api-access-rckjk\") pod \"dns-operator-744455d44c-xr24g\" (UID: \"ccf0e825-0465-40ae-b0ca-f4f7c377e518\") " pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.494182 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.494774 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:30.994757943 +0000 UTC m=+225.114390176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.508888 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9tmn\" (UniqueName: \"kubernetes.io/projected/0d2f415a-2626-45f9-baf0-68ab25b9d079-kube-api-access-l9tmn\") pod \"olm-operator-6b444d44fb-8m94r\" (UID: \"0d2f415a-2626-45f9-baf0-68ab25b9d079\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.527270 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmrdb\" (UniqueName: \"kubernetes.io/projected/2be1cb07-55b6-4220-989e-13415c3156b2-kube-api-access-kmrdb\") pod \"openshift-controller-manager-operator-756b6f6bc6-pvc8t\" (UID: \"2be1cb07-55b6-4220-989e-13415c3156b2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.527460 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-f9d7485db-gk97q"] Mar 08 00:09:30 crc kubenswrapper[4713]: W0308 00:09:30.536374 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d068555_56f2_4bcf_8b4c_cc574ad087fa.slice/crio-8ff48c0a58bcc4629742d5c5adc29f9d4e6b0e3c6857275419af98c5e780994a WatchSource:0}: Error finding container 8ff48c0a58bcc4629742d5c5adc29f9d4e6b0e3c6857275419af98c5e780994a: Status 404 returned error can't find the container with id 8ff48c0a58bcc4629742d5c5adc29f9d4e6b0e3c6857275419af98c5e780994a Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.546753 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm5dw\" (UniqueName: \"kubernetes.io/projected/5eb834dd-5358-45c4-bbca-50baf0e8656b-kube-api-access-wm5dw\") pod \"catalog-operator-68c6474976-bn56j\" (UID: \"5eb834dd-5358-45c4-bbca-50baf0e8656b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.568752 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p77q9\" (UniqueName: \"kubernetes.io/projected/6e21b584-0781-4fa9-8811-332d42755c17-kube-api-access-p77q9\") pod \"machine-config-controller-84d6567774-shncx\" (UID: \"6e21b584-0781-4fa9-8811-332d42755c17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.575755 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" event={"ID":"c5cc5125-93f0-4709-afbd-7aa6a888b641","Type":"ContainerStarted","Data":"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.575809 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" event={"ID":"c5cc5125-93f0-4709-afbd-7aa6a888b641","Type":"ContainerStarted","Data":"4dcd3efc63c2bb82108f5db86db8f7d5ce1c4ffb7c4a91ed149a6c9ab7e1050e"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.576740 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.580492 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" event={"ID":"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec","Type":"ContainerStarted","Data":"3983d2aa68f6da8f44569c63ac9c2a782dcda7998ffa916f2360f4db5f684ce4"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.580535 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" event={"ID":"8e76411a-c4c2-4822-9ec9-a7e73c15f7ec","Type":"ContainerStarted","Data":"62d2206459d89bf6d737d11946b11561a84e0e852500858d623d83d9845ccafa"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.583240 4713 generic.go:334] "Generic (PLEG): container finished" podID="bfa92863-23f8-42d4-8e73-433bf546d304" containerID="8d932a99f7ba16281e1a18006ec4ee445f240f93e3a565e114dcfe8b04d9a720" exitCode=0 Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.583310 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-58c66" event={"ID":"bfa92863-23f8-42d4-8e73-433bf546d304","Type":"ContainerDied","Data":"8d932a99f7ba16281e1a18006ec4ee445f240f93e3a565e114dcfe8b04d9a720"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.583341 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-58c66" 
event={"ID":"bfa92863-23f8-42d4-8e73-433bf546d304","Type":"ContainerStarted","Data":"b0ba9787a9b65059ba19235191be65a05b519d232255c85dc8cc1702a1a33dff"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.583975 4713 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7snq7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.584028 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.589615 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6qkt\" (UniqueName: \"kubernetes.io/projected/fd936d68-81ed-4923-8078-5ad0116d532e-kube-api-access-j6qkt\") pod \"migrator-59844c95c7-wld5v\" (UID: \"fd936d68-81ed-4923-8078-5ad0116d532e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.595174 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.595498 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.095473974 +0000 UTC m=+225.215106197 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.595579 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.595926 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.095913705 +0000 UTC m=+225.215545938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.601250 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.604123 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" event={"ID":"c6893b56-2395-4f91-9349-c23b48b957c8","Type":"ContainerStarted","Data":"c3313856c8bd270e779e3471c10a34b6df61acc366568e89bf7663e22bdf4185"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.604168 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" event={"ID":"c6893b56-2395-4f91-9349-c23b48b957c8","Type":"ContainerStarted","Data":"3a26477e3ba90b535125524bf64cf9ce159f8050230c417111621ff9c77ef8d0"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.604183 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" event={"ID":"c6893b56-2395-4f91-9349-c23b48b957c8","Type":"ContainerStarted","Data":"29a7c4c18a7333fd6b9259f4ff1a952ca8c0aef11eb27d81e32d45184ecd9ba3"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.606450 4713 generic.go:334] "Generic (PLEG): container finished" podID="c61cbc0b-441e-4704-accf-35963b3758aa" containerID="ef8e161c1c91b1f5e0788ba38a4581e70a6ba4e4085a7309a178813299b2fd64" exitCode=0 Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.606514 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" event={"ID":"c61cbc0b-441e-4704-accf-35963b3758aa","Type":"ContainerDied","Data":"ef8e161c1c91b1f5e0788ba38a4581e70a6ba4e4085a7309a178813299b2fd64"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.606557 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" event={"ID":"c61cbc0b-441e-4704-accf-35963b3758aa","Type":"ContainerStarted","Data":"dd9fcb9296a5e60bcd45c21270f92c6b89629e223159d3ecc2eaaf679c9db764"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.607529 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gk97q" event={"ID":"1d068555-56f2-4bcf-8b4c-cc574ad087fa","Type":"ContainerStarted","Data":"8ff48c0a58bcc4629742d5c5adc29f9d4e6b0e3c6857275419af98c5e780994a"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.610130 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" event={"ID":"e4ba1fb6-83e1-4a29-93a5-5abf00f86718","Type":"ContainerStarted","Data":"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.610176 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" event={"ID":"e4ba1fb6-83e1-4a29-93a5-5abf00f86718","Type":"ContainerStarted","Data":"a48c3b313279a8d19f79d36e4fdb5a5265b310ba5fe079364f758a6f08817617"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.611705 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.612986 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" 
event={"ID":"10940629-a0dc-4828-a913-20a754f4896b","Type":"ContainerStarted","Data":"27db8f6bfae774d8dea6ec16c8c4cdd7826ed457f2c15b6aa7bcd6ca93f36a27"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.613054 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" event={"ID":"10940629-a0dc-4828-a913-20a754f4896b","Type":"ContainerStarted","Data":"2d50ddd00ecb585fac16ea196ec00bce2d2c4db3abf5dd9994fc43c3faed8cad"} Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.618457 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8lcv\" (UniqueName: \"kubernetes.io/projected/c9f8ace1-247f-4128-b3f7-95037fb1a156-kube-api-access-w8lcv\") pod \"machine-approver-56656f9798-tdq97\" (UID: \"c9f8ace1-247f-4128-b3f7-95037fb1a156\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.618645 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.621395 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29548800-ghv4d"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.621582 4713 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4xznw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.621634 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.627990 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.628327 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dspc4\" (UniqueName: \"kubernetes.io/projected/3a74e1e8-3928-4220-b55d-ee42585ef1ee-kube-api-access-dspc4\") pod \"cluster-samples-operator-665b6dd947-6swxn\" (UID: \"3a74e1e8-3928-4220-b55d-ee42585ef1ee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.628494 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.646898 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.648273 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.651326 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxfnr\" (UniqueName: \"kubernetes.io/projected/9fed4c23-4a16-4502-87eb-d1dd68aa1af5-kube-api-access-qxfnr\") pod \"multus-admission-controller-857f4d67dd-2qwgb\" (UID: \"9fed4c23-4a16-4502-87eb-d1dd68aa1af5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.682485 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.692137 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g45bj\" (UniqueName: \"kubernetes.io/projected/0e43994e-0aa1-4541-bce9-502bbc1dc0a0-kube-api-access-g45bj\") pod \"etcd-operator-b45778765-4qpfj\" (UID: \"0e43994e-0aa1-4541-bce9-502bbc1dc0a0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.697993 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.698122 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.698541 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.198525973 +0000 UTC m=+225.318158206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.710174 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-795x2\" (UniqueName: \"kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2\") pod \"marketplace-operator-79b997595-p9hqz\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.710439 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.726264 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2k6nd"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.726816 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.734615 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6wlf\" (UniqueName: \"kubernetes.io/projected/141fc694-b9ce-4b84-9e39-0e79a487e398-kube-api-access-j6wlf\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvsbq\" (UID: \"141fc694-b9ce-4b84-9e39-0e79a487e398\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.751640 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tch6h\" (UniqueName: \"kubernetes.io/projected/496a4fbf-c338-4b64-96a5-dda456094c28-kube-api-access-tch6h\") pod \"machine-config-operator-74547568cd-q7bjv\" (UID: \"496a4fbf-c338-4b64-96a5-dda456094c28\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.770878 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp6ps\" (UniqueName: \"kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps\") pod \"oauth-openshift-558db77b4-c8gbn\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.790922 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k27jn\" 
(UniqueName: \"kubernetes.io/projected/d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4-kube-api-access-k27jn\") pod \"package-server-manager-789f6589d5-h5mxt\" (UID: \"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.795783 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-z4s84"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.797719 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.799668 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.800140 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.300119956 +0000 UTC m=+225.419752259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: W0308 00:09:30.801717 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00793875_21cf_4a6e_8da2_2d94bd3725c4.slice/crio-f4a25a1d552f9b27130e4a2325b1c7b384ce6efa15ac9ae4b909274ad89af8cf WatchSource:0}: Error finding container f4a25a1d552f9b27130e4a2325b1c7b384ce6efa15ac9ae4b909274ad89af8cf: Status 404 returned error can't find the container with id f4a25a1d552f9b27130e4a2325b1c7b384ce6efa15ac9ae4b909274ad89af8cf Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.819722 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d587l\" (UniqueName: \"kubernetes.io/projected/d3811a82-b0fe-4e06-948a-79cbbc840a98-kube-api-access-d587l\") pod \"ingress-operator-5b745b69d9-bltk5\" (UID: \"d3811a82-b0fe-4e06-948a-79cbbc840a98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.820285 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v"] Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.830465 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmkds\" (UniqueName: \"kubernetes.io/projected/548e19ee-14eb-4075-b9e3-69178800837c-kube-api-access-wmkds\") pod \"router-default-5444994796-drs4q\" (UID: \"548e19ee-14eb-4075-b9e3-69178800837c\") " 
pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.836569 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" Mar 08 00:09:30 crc kubenswrapper[4713]: W0308 00:09:30.838380 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62cfca3e_2ad8_4964_bd9a_5f907f09ca1e.slice/crio-d0161141af7255dad686f4f84bc54018c222652a3d7e33ad5ffe56ff73f94e21 WatchSource:0}: Error finding container d0161141af7255dad686f4f84bc54018c222652a3d7e33ad5ffe56ff73f94e21: Status 404 returned error can't find the container with id d0161141af7255dad686f4f84bc54018c222652a3d7e33ad5ffe56ff73f94e21 Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.848786 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jhxcl\" (UID: \"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.867791 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.869750 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dbf7b38-8980-49e5-956c-08e443912846-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4p529\" (UID: \"0dbf7b38-8980-49e5-956c-08e443912846\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.877152 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.885238 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.892040 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.894570 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nk4f\" (UniqueName: \"kubernetes.io/projected/f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc-kube-api-access-7nk4f\") pod \"control-plane-machine-set-operator-78cbb6b69f-7wd77\" (UID: \"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.901552 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:30 crc kubenswrapper[4713]: E0308 00:09:30.901984 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.401964625 +0000 UTC m=+225.521596858 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.910067 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.917293 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.919465 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrkff\" (UniqueName: \"kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff\") pod \"auto-csr-approver-29548808-nd57l\" (UID: \"fdccd72c-79d7-4388-926e-0539c571dafe\") " pod="openshift-infra/auto-csr-approver-29548808-nd57l" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.932095 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.933913 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.940464 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.954544 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.961642 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.974297 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.977893 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:30 crc kubenswrapper[4713]: I0308 00:09:30.989584 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.003860 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.004303 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.504288676 +0000 UTC m=+225.623920909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.006294 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-shncx"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.008629 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.013815 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsn7h\" (UniqueName: \"kubernetes.io/projected/a8c7be2b-608c-4089-b8a6-76bef69c3588-kube-api-access-bsn7h\") pod \"machine-config-server-sxbdk\" (UID: \"a8c7be2b-608c-4089-b8a6-76bef69c3588\") " pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.032539 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8vz2\" (UniqueName: \"kubernetes.io/projected/063a79dd-fbe8-4562-98bc-deb309b25182-kube-api-access-m8vz2\") pod \"csi-hostpathplugin-q84x9\" (UID: \"063a79dd-fbe8-4562-98bc-deb309b25182\") " pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.043523 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xr24g"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.048725 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpf9l\" (UniqueName: \"kubernetes.io/projected/ee63f184-4609-43d4-bdc1-2c840aef6d7f-kube-api-access-rpf9l\") pod \"service-ca-9c57cc56f-c4nq5\" (UID: \"ee63f184-4609-43d4-bdc1-2c840aef6d7f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.058768 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548808-nd57l" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.073304 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmkpx\" (UniqueName: \"kubernetes.io/projected/39da2ba4-aebb-485b-8e46-7ffc36efa490-kube-api-access-mmkpx\") pod \"dns-default-lwhnh\" (UID: \"39da2ba4-aebb-485b-8e46-7ffc36efa490\") " pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.076313 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.105211 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.110608 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzk6\" (UniqueName: \"kubernetes.io/projected/158ba4b3-9da3-4a83-95dd-e625c7b19a2b-kube-api-access-lvzk6\") pod \"ingress-canary-xmjhj\" (UID: \"158ba4b3-9da3-4a83-95dd-e625c7b19a2b\") " pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.115601 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmjhj" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.116531 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.116596 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.116727 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.616702211 +0000 UTC m=+225.736334444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.117120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.117502 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sxbdk" Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.119897 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.61986959 +0000 UTC m=+225.739501823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.123189 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.144406 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x9m6\" (UniqueName: \"kubernetes.io/projected/3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6-kube-api-access-4x9m6\") pod \"packageserver-d55dfcdfc-g99pk\" (UID: \"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.146794 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-268pq\" (UniqueName: \"kubernetes.io/projected/899ec382-6c79-460e-9e3c-9dfb25867855-kube-api-access-268pq\") pod \"service-ca-operator-777779d784-5bltg\" (UID: \"899ec382-6c79-460e-9e3c-9dfb25867855\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.193617 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.218351 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.218579 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.218622 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.218682 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l55j\" (UniqueName: \"kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.218775 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.718760865 +0000 UTC m=+225.838393098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.284151 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.322348 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.322485 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.322534 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.323211 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.323475 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l55j\" (UniqueName: \"kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.323640 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.82362732 +0000 UTC m=+225.943259553 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.328463 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.369451 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.371031 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l55j\" (UniqueName: \"kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j\") pod \"collect-profiles-29548800-cclv4\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.390499 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.394463 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.397196 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v"] Mar 08 00:09:31 crc kubenswrapper[4713]: W0308 00:09:31.418231 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod548e19ee_14eb_4075_b9e3_69178800837c.slice/crio-07687e468c691ac7ff50057d0bbfea873d5edf04cd2b2be0edb2606e41e054f9 WatchSource:0}: Error finding container 07687e468c691ac7ff50057d0bbfea873d5edf04cd2b2be0edb2606e41e054f9: Status 404 returned error can't find the container with id 07687e468c691ac7ff50057d0bbfea873d5edf04cd2b2be0edb2606e41e054f9 Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.432064 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.432225 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.932198948 +0000 UTC m=+226.051831181 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.432271 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.432563 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:31.932550797 +0000 UTC m=+226.052183030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.533608 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.533883 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.033848172 +0000 UTC m=+226.153480395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.534147 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.534456 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.034445127 +0000 UTC m=+226.154077360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.535945 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.635235 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.637068 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.637321 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv"] Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.637466 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.137445185 +0000 UTC m=+226.257077418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.657753 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.657957 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" event={"ID":"fd936d68-81ed-4923-8078-5ad0116d532e","Type":"ContainerStarted","Data":"2b04c5dba8341e7071b5a25348b24b8fe49f2fa0f49283898e52f444691e4c4d"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.662570 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gk97q" event={"ID":"1d068555-56f2-4bcf-8b4c-cc574ad087fa","Type":"ContainerStarted","Data":"9d8dc8439406027f01ceb6aedaedc6496607794c932cf2f7e302ec056f77213e"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.663195 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.702463 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-drs4q" event={"ID":"548e19ee-14eb-4075-b9e3-69178800837c","Type":"ContainerStarted","Data":"07687e468c691ac7ff50057d0bbfea873d5edf04cd2b2be0edb2606e41e054f9"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.723529 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4qpfj"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.726501 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" event={"ID":"00793875-21cf-4a6e-8da2-2d94bd3725c4","Type":"ContainerStarted","Data":"4202dd9aed16f2668e430b9808f118d1000f996e9ab98c6807453d6e03386ad7"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.727187 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" event={"ID":"00793875-21cf-4a6e-8da2-2d94bd3725c4","Type":"ContainerStarted","Data":"f4a25a1d552f9b27130e4a2325b1c7b384ce6efa15ac9ae4b909274ad89af8cf"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.728426 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.738015 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29548800-ghv4d" event={"ID":"2ab8d84d-9110-4bed-8288-4764d7c10f74","Type":"ContainerStarted","Data":"f9566defd908e4b2b14ead5994a9afb7bc984f75e3c8235a78747cca1c95babf"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.738075 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29548800-ghv4d" 
event={"ID":"2ab8d84d-9110-4bed-8288-4764d7c10f74","Type":"ContainerStarted","Data":"6fbb096291ab484496304a21d48e0c187a353974f802449b0a324f5c483976f8"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.744164 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.744508 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2qwgb"] Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.744898 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.244870325 +0000 UTC m=+226.364502558 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.759667 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-2k6nd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.759716 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" podUID="00793875-21cf-4a6e-8da2-2d94bd3725c4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.764660 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.770739 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" event={"ID":"452f8fcb-d31f-41d4-be85-d041d7efc756","Type":"ContainerStarted","Data":"c14731dbfabd77f2630c53172ea07e30cf12a7520235295ed5978f0dac04e3b1"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.770775 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" 
event={"ID":"452f8fcb-d31f-41d4-be85-d041d7efc756","Type":"ContainerStarted","Data":"7d557f649440aa7d8979e239e3dbc43be1e038a5d177bc7e9b64392203cbedb0"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.790570 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" event={"ID":"0d2f415a-2626-45f9-baf0-68ab25b9d079","Type":"ContainerStarted","Data":"b986c3ea3367f0d8e16ad232b5d65a39e5e8c1b421c8da06daf14ef57c0db285"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.792861 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" event={"ID":"5eb834dd-5358-45c4-bbca-50baf0e8656b","Type":"ContainerStarted","Data":"0d44a394195fcb4f42952a22e57cfd1a6b6f0db20a3d3d6af4abfcb58f3829f3"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.801687 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" event={"ID":"c61cbc0b-441e-4704-accf-35963b3758aa","Type":"ContainerStarted","Data":"f069cfc486387c5cde34f5afb6cecf83b6fb955230bf1ce769adaf1a981ffba9"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.815052 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z4s84" event={"ID":"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e","Type":"ContainerStarted","Data":"0e456590ed6aec138d6c2be36909b347ef8e66d85928a8221898c7ed939f09c4"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.815105 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z4s84" event={"ID":"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e","Type":"ContainerStarted","Data":"d0161141af7255dad686f4f84bc54018c222652a3d7e33ad5ffe56ff73f94e21"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.815582 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-z4s84" Mar 
08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.817147 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.817189 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.821201 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" event={"ID":"ccf0e825-0465-40ae-b0ca-f4f7c377e518","Type":"ContainerStarted","Data":"d3d0b0a012b975a10283fb4300f9ac3db386cac5a5ffd9a8c67b5efdc2cdbb7b"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.822460 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" event={"ID":"69b6d0bc-e512-432d-9a6f-f79318c0f571","Type":"ContainerStarted","Data":"0f516b5e95ed11912f6e66ecaa5c09eef45730e1cf73f99b3a1ddd5f02aad27c"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.823265 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" event={"ID":"8f9a6567-ebe5-4ba9-80ab-a2cd48818942","Type":"ContainerStarted","Data":"1b788d5a6379b1837b939a466d3a77ee20b5cff8b78cf4ee661310e5c54b3d13"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.825453 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-c4nq5"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.826024 4713 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" event={"ID":"6e21b584-0781-4fa9-8811-332d42755c17","Type":"ContainerStarted","Data":"cfd46bb74a3a2e4e75cc309902049244faa91690798932d4b2acdf457dc24654"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.830313 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-58c66" event={"ID":"bfa92863-23f8-42d4-8e73-433bf546d304","Type":"ContainerStarted","Data":"acfab74d0e5aa0f60ce6b65943323fce0eb8ed34518b52222ec8a0203d809698"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.832373 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" event={"ID":"c9f8ace1-247f-4128-b3f7-95037fb1a156","Type":"ContainerStarted","Data":"aa86b13cd04527b13fb395768d0e88b7a726c753651d2b1f343d3553ca45cc9c"} Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.832906 4713 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7snq7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.832955 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.833076 4713 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4xznw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": 
dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.833120 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.845598 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.845762 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.345730939 +0000 UTC m=+226.465363182 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.846142 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:31 crc kubenswrapper[4713]: E0308 00:09:31.846435 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.346426976 +0000 UTC m=+226.466059209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:31 crc kubenswrapper[4713]: W0308 00:09:31.914049 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod141fc694_b9ce_4b84_9e39_0e79a487e398.slice/crio-a4b9a606d9fdab7476c0a6affdc78e2ff079905daef0c4e0b4ceda9a089c39d6 WatchSource:0}: Error finding container a4b9a606d9fdab7476c0a6affdc78e2ff079905daef0c4e0b4ceda9a089c39d6: Status 404 returned error can't find the container with id a4b9a606d9fdab7476c0a6affdc78e2ff079905daef0c4e0b4ceda9a089c39d6 Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.934867 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.936106 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.941037 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn"] Mar 08 00:09:31 crc kubenswrapper[4713]: I0308 00:09:31.950239 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:31 crc 
kubenswrapper[4713]: E0308 00:09:31.951673 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.45165271 +0000 UTC m=+226.571284953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.014995 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-fhq98" podStartSLOduration=155.014973791 podStartE2EDuration="2m35.014973791s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:32.01331257 +0000 UTC m=+226.132944803" watchObservedRunningTime="2026-03-08 00:09:32.014973791 +0000 UTC m=+226.134606024" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.040610 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xmjhj"] Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.051172 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dbf7b38_8980_49e5_956c_08e443912846.slice/crio-4e425132b6bddb6f03bc89cb121ccf34d1db0552ad0b4d517b5706e92cc33ab3 WatchSource:0}: Error finding container 
4e425132b6bddb6f03bc89cb121ccf34d1db0552ad0b4d517b5706e92cc33ab3: Status 404 returned error can't find the container with id 4e425132b6bddb6f03bc89cb121ccf34d1db0552ad0b4d517b5706e92cc33ab3 Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.051850 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.052211 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.552200617 +0000 UTC m=+226.671832850 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.073103 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-dkkh7" podStartSLOduration=154.073080621 podStartE2EDuration="2m34.073080621s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:32.068221319 +0000 UTC m=+226.187853552" watchObservedRunningTime="2026-03-08 00:09:32.073080621 +0000 UTC m=+226.192712864" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.074863 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.153006 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.153530 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:32.653497362 +0000 UTC m=+226.773129595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.186214 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158ba4b3_9da3_4a83_95dd_e625c7b19a2b.slice/crio-e8a3e872d40d500d6f0874070ede52356ca1f0983fc3d005e18d1ae2ddedd2f0 WatchSource:0}: Error finding container e8a3e872d40d500d6f0874070ede52356ca1f0983fc3d005e18d1ae2ddedd2f0: Status 404 returned error can't find the container with id e8a3e872d40d500d6f0874070ede52356ca1f0983fc3d005e18d1ae2ddedd2f0 Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.187537 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.214797 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lwhnh"] Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.216118 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3811a82_b0fe_4e06_948a_79cbbc840a98.slice/crio-7df7f4d33d83755772a6cd1dc146a40e86d3bcef9e2facebd3acdd5f7346cddc WatchSource:0}: Error finding container 7df7f4d33d83755772a6cd1dc146a40e86d3bcef9e2facebd3acdd5f7346cddc: Status 404 returned error can't find the container with id 7df7f4d33d83755772a6cd1dc146a40e86d3bcef9e2facebd3acdd5f7346cddc Mar 08 00:09:32 crc 
kubenswrapper[4713]: I0308 00:09:32.254708 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.255452 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39da2ba4_aebb_485b_8e46_7ffc36efa490.slice/crio-b1f7244b40627128be2dcf7963c65b437ef73ede8622ce1ef24a4d1d33b02497 WatchSource:0}: Error finding container b1f7244b40627128be2dcf7963c65b437ef73ede8622ce1ef24a4d1d33b02497: Status 404 returned error can't find the container with id b1f7244b40627128be2dcf7963c65b437ef73ede8622ce1ef24a4d1d33b02497 Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.256248 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.756230883 +0000 UTC m=+226.875863126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.303561 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q84x9"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.312747 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548808-nd57l"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.356032 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.356454 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.856424631 +0000 UTC m=+226.976056864 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.379075 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.409106 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.457945 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.458285 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:32.95826781 +0000 UTC m=+227.077900043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.495274 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4"] Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.502705 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod063a79dd_fbe8_4562_98bc_deb309b25182.slice/crio-fe67e4e82591b9266983190fce32b17f5c4383bc0b4f0ec37160261fdf04da6e WatchSource:0}: Error finding container fe67e4e82591b9266983190fce32b17f5c4383bc0b4f0ec37160261fdf04da6e: Status 404 returned error can't find the container with id fe67e4e82591b9266983190fce32b17f5c4383bc0b4f0ec37160261fdf04da6e Mar 08 00:09:32 crc kubenswrapper[4713]: W0308 00:09:32.503219 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdccd72c_79d7_4388_926e_0539c571dafe.slice/crio-0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa WatchSource:0}: Error finding container 0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa: Status 404 returned error can't find the container with id 0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.505460 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.510155 4713 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5bltg"] Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.560152 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.560793 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.060770236 +0000 UTC m=+227.180402469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.645576 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lg6jl" podStartSLOduration=155.645556156 podStartE2EDuration="2m35.645556156s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:32.642878949 +0000 UTC m=+226.762511202" watchObservedRunningTime="2026-03-08 00:09:32.645556156 +0000 UTC m=+226.765188389" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 
00:09:32.661351 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.661749 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.161721222 +0000 UTC m=+227.281353455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.700324 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-gk97q" podStartSLOduration=154.700296691 podStartE2EDuration="2m34.700296691s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:32.69546839 +0000 UTC m=+226.815100623" watchObservedRunningTime="2026-03-08 00:09:32.700296691 +0000 UTC m=+226.819928924" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.762951 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.763160 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.26312556 +0000 UTC m=+227.382757793 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.763282 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.763580 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.263568581 +0000 UTC m=+227.383200814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.842616 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" event={"ID":"6e21b584-0781-4fa9-8811-332d42755c17","Type":"ContainerStarted","Data":"24fb7b611b2bba7816e13ffd395a56cee4b640ca9e46deb1afb7b067011d4ee1"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.844321 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" event={"ID":"0d2f415a-2626-45f9-baf0-68ab25b9d079","Type":"ContainerStarted","Data":"babe5ff1551993631dbb59509786ee87fc512912b19e1ab02fc1f3a5e61a47dc"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.845361 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" event={"ID":"063a79dd-fbe8-4562-98bc-deb309b25182","Type":"ContainerStarted","Data":"fe67e4e82591b9266983190fce32b17f5c4383bc0b4f0ec37160261fdf04da6e"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.846837 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" event={"ID":"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6","Type":"ContainerStarted","Data":"9fbc51b29e200e46787490449f1137ed821ea23125402318a6489ea2356fff8e"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.848319 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xmjhj" 
event={"ID":"158ba4b3-9da3-4a83-95dd-e625c7b19a2b","Type":"ContainerStarted","Data":"e8a3e872d40d500d6f0874070ede52356ca1f0983fc3d005e18d1ae2ddedd2f0"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.854741 4713 generic.go:334] "Generic (PLEG): container finished" podID="452f8fcb-d31f-41d4-be85-d041d7efc756" containerID="c14731dbfabd77f2630c53172ea07e30cf12a7520235295ed5978f0dac04e3b1" exitCode=0 Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.854835 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" event={"ID":"452f8fcb-d31f-41d4-be85-d041d7efc756","Type":"ContainerDied","Data":"c14731dbfabd77f2630c53172ea07e30cf12a7520235295ed5978f0dac04e3b1"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.857422 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" event={"ID":"9e570b68-8b4c-42e3-839d-f37943999246","Type":"ContainerStarted","Data":"8a2d896d73aedf449a67c5c1becd624d05fd0cc1bac64192c1528302ec9e1810"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.860521 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" event={"ID":"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc","Type":"ContainerStarted","Data":"4f735baf03071a713358d5084a3ed1c39a064b786c0c8aab2cec625051e1bf4f"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.864251 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.864631 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.3646113 +0000 UTC m=+227.484243533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.868112 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" event={"ID":"d3811a82-b0fe-4e06-948a-79cbbc840a98","Type":"ContainerStarted","Data":"7df7f4d33d83755772a6cd1dc146a40e86d3bcef9e2facebd3acdd5f7346cddc"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.871790 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" event={"ID":"fd936d68-81ed-4923-8078-5ad0116d532e","Type":"ContainerStarted","Data":"c1cc2bd2761912bed0bce72c583ff4a3ce293060ab546c49da1234cb5b624829"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.873847 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" event={"ID":"ee63f184-4609-43d4-bdc1-2c840aef6d7f","Type":"ContainerStarted","Data":"1403c7f7c82104f1fb2d5acbca121b2f621f34934f6c942ece623278837b82a7"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.875557 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" 
event={"ID":"5eb834dd-5358-45c4-bbca-50baf0e8656b","Type":"ContainerStarted","Data":"7768995058b6d14ec7324fef4fdf9eb4130adf2619a94fd9384329ad45f0dda9"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.876778 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" event={"ID":"8f9a6567-ebe5-4ba9-80ab-a2cd48818942","Type":"ContainerStarted","Data":"b8cee89ff59a87f0aef0cba5e55318481207c1684c87c8c0e24a463d0b451164"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.877792 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" event={"ID":"c9df8d9c-b59f-4a1c-9fb4-668123290569","Type":"ContainerStarted","Data":"e0d410e7c38a223bcd0189e0430b8bd6e62ba561f8515070eac1a52a52fdb35d"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.879278 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sxbdk" event={"ID":"a8c7be2b-608c-4089-b8a6-76bef69c3588","Type":"ContainerStarted","Data":"1a02a4260c82b95218d95b7ec0f782a08c30d534af39889958bef08ce68a1906"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.879305 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sxbdk" event={"ID":"a8c7be2b-608c-4089-b8a6-76bef69c3588","Type":"ContainerStarted","Data":"51ebfe85afcc3b7f2946066c966ebdfb5ef2285578327fa0c1fd2331c75de2e5"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.880599 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" event={"ID":"ccf0e825-0465-40ae-b0ca-f4f7c377e518","Type":"ContainerStarted","Data":"0ff3f228823f254df81c9400e3bf969b1989214eb6d53eeaa806767239498a57"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.881799 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" event={"ID":"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0","Type":"ContainerStarted","Data":"2b23ab3e26964ba12243f80dc785e3757a7616b853625567abe3a07d108fa2ab"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.883443 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" event={"ID":"141fc694-b9ce-4b84-9e39-0e79a487e398","Type":"ContainerStarted","Data":"d73551542a94ae92898d6c7f60f43b5e7b07f43a7fae03dedec4b045380c2e9a"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.883475 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" event={"ID":"141fc694-b9ce-4b84-9e39-0e79a487e398","Type":"ContainerStarted","Data":"a4b9a606d9fdab7476c0a6affdc78e2ff079905daef0c4e0b4ceda9a089c39d6"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.884633 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" event={"ID":"2be1cb07-55b6-4220-989e-13415c3156b2","Type":"ContainerStarted","Data":"8df7f254cdc361cd7a84eb9568ef8a92c58bfb920fd5787cd92bbf9eb19b0868"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.885449 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" event={"ID":"3a74e1e8-3928-4220-b55d-ee42585ef1ee","Type":"ContainerStarted","Data":"7503f9d76e1ead024b2d9e32c270ed5c7994c52e76c635dedfba01368986250e"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.887192 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-58c66" 
event={"ID":"bfa92863-23f8-42d4-8e73-433bf546d304","Type":"ContainerStarted","Data":"a2bffc41930aae799298676f6731be7f1a78453e81f87a04e4c86069af5275cd"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.891632 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" podStartSLOduration=154.891604888 podStartE2EDuration="2m34.891604888s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:32.88887093 +0000 UTC m=+227.008503173" watchObservedRunningTime="2026-03-08 00:09:32.891604888 +0000 UTC m=+227.011237111" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.898970 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" event={"ID":"c9f8ace1-247f-4128-b3f7-95037fb1a156","Type":"ContainerStarted","Data":"112d9d26a15ed14170c83bc124ad4a214a7baca62e66a05d9828873540b36a76"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.899989 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" event={"ID":"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4","Type":"ContainerStarted","Data":"e5e4ce108e48921131f575c6266cdd05f448c77b1476fcea8f79ebd51be164e8"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.902038 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" event={"ID":"0dbf7b38-8980-49e5-956c-08e443912846","Type":"ContainerStarted","Data":"4e425132b6bddb6f03bc89cb121ccf34d1db0552ad0b4d517b5706e92cc33ab3"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.902938 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" event={"ID":"9fed4c23-4a16-4502-87eb-d1dd68aa1af5","Type":"ContainerStarted","Data":"9a9c988848cea61452547df38ee81f4d9d10b67c33f46376e69f961257d0ca10"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.904246 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-drs4q" event={"ID":"548e19ee-14eb-4075-b9e3-69178800837c","Type":"ContainerStarted","Data":"5fba0849bd6ff6d74f814a7c60b06c8112cccf8bb3be1dcd07c57c070cebdb3a"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.905354 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" event={"ID":"69b6d0bc-e512-432d-9a6f-f79318c0f571","Type":"ContainerStarted","Data":"a45c92beedbf0140113aefd9290a111f882f2b9dd8f6241440aabf1ff34df979"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.906163 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548808-nd57l" event={"ID":"fdccd72c-79d7-4388-926e-0539c571dafe","Type":"ContainerStarted","Data":"0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.906904 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lwhnh" event={"ID":"39da2ba4-aebb-485b-8e46-7ffc36efa490","Type":"ContainerStarted","Data":"b1f7244b40627128be2dcf7963c65b437ef73ede8622ce1ef24a4d1d33b02497"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.908428 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" event={"ID":"0e43994e-0aa1-4541-bce9-502bbc1dc0a0","Type":"ContainerStarted","Data":"4463a907ae7393ef0e3efdac52e43e38ff1a3c88f6572b9c8af64744303321a8"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.909210 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" event={"ID":"496a4fbf-c338-4b64-96a5-dda456094c28","Type":"ContainerStarted","Data":"190a60dacb57686f7527fd359dcbee53cb27d86651b512ad3ef2e82c71e60229"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.909967 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" event={"ID":"899ec382-6c79-460e-9e3c-9dfb25867855","Type":"ContainerStarted","Data":"be2e83b64ebb1f15ce7422655ee6ab80fd10154ea455c673dcb802f1fea0d293"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.910787 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" event={"ID":"2a04a017-1594-43d7-a796-8c676b28095e","Type":"ContainerStarted","Data":"f170f29d26ed2ed2fc88befac7041785958542192c67ab73459f56dea209da08"} Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911211 4713 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7snq7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911427 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911466 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" 
Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911614 4713 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4xznw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911642 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911874 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911881 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-2k6nd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.911917 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" podUID="00793875-21cf-4a6e-8da2-2d94bd3725c4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 08 00:09:32 crc kubenswrapper[4713]: I0308 00:09:32.965298 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:32 crc kubenswrapper[4713]: E0308 00:09:32.965608 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.465594437 +0000 UTC m=+227.585226670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.066592 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.066744 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.566711638 +0000 UTC m=+227.686343871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.067332 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.068106 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.568096083 +0000 UTC m=+227.687728316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.168890 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.169211 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.669193093 +0000 UTC m=+227.788825316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.269727 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" podStartSLOduration=155.269711739 podStartE2EDuration="2m35.269711739s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:33.26775911 +0000 UTC m=+227.387391343" watchObservedRunningTime="2026-03-08 00:09:33.269711739 +0000 UTC m=+227.389343972" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.270087 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.770076548 +0000 UTC m=+227.889708781 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.269849 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.326322 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-z4s84" podStartSLOduration=155.326305141 podStartE2EDuration="2m35.326305141s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:33.322121426 +0000 UTC m=+227.441753679" watchObservedRunningTime="2026-03-08 00:09:33.326305141 +0000 UTC m=+227.445937374" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.361043 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29548800-ghv4d" podStartSLOduration=156.361025893 podStartE2EDuration="2m36.361025893s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:33.359119835 +0000 UTC m=+227.478752068" 
watchObservedRunningTime="2026-03-08 00:09:33.361025893 +0000 UTC m=+227.480658126" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.371129 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.371472 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.871457515 +0000 UTC m=+227.991089748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.371681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.371951 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.871942778 +0000 UTC m=+227.991575011 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.404689 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" podStartSLOduration=155.40467388 podStartE2EDuration="2m35.40467388s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:33.402486245 +0000 UTC m=+227.522118498" watchObservedRunningTime="2026-03-08 00:09:33.40467388 +0000 UTC m=+227.524306113" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.444707 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" podStartSLOduration=155.444691366 podStartE2EDuration="2m35.444691366s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:33.442265945 +0000 UTC m=+227.561898178" watchObservedRunningTime="2026-03-08 00:09:33.444691366 +0000 UTC m=+227.564323609" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.473171 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.473394 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.973365026 +0000 UTC m=+228.092997259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.473521 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.473892 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:33.973879419 +0000 UTC m=+228.093511652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.657450 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.657791 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.157772418 +0000 UTC m=+228.277404651 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.759363 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.759626 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.259614989 +0000 UTC m=+228.379247222 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.860895 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.861072 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.361045206 +0000 UTC m=+228.480677439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.861397 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.861668 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.361659332 +0000 UTC m=+228.481291565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.915944 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-2k6nd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.915998 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2k6nd" podUID="00793875-21cf-4a6e-8da2-2d94bd3725c4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.962166 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.962337 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.462319411 +0000 UTC m=+228.581951644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:33 crc kubenswrapper[4713]: I0308 00:09:33.962423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:33 crc kubenswrapper[4713]: E0308 00:09:33.963210 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.463192763 +0000 UTC m=+228.582824996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.063039 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.063233 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.563203206 +0000 UTC m=+228.682835439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.063331 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.063618 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.563604616 +0000 UTC m=+228.683236849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.164015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.164266 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.664231384 +0000 UTC m=+228.783863617 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.164400 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.164885 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.664871281 +0000 UTC m=+228.784503514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.265302 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.265493 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.765463588 +0000 UTC m=+228.885095821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.265621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.266056 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.766040363 +0000 UTC m=+228.885672596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.366111 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.366398 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.866380734 +0000 UTC m=+228.986012967 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.478461 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.479276 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:34.97925405 +0000 UTC m=+229.098886283 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.500308 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.500361 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.579795 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.580361 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.08034195 +0000 UTC m=+229.199974193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.656309 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.656410 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.657644 4713 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-l464l container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.657684 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" podUID="c61cbc0b-441e-4704-accf-35963b3758aa" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.681509 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.682757 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.182740343 +0000 UTC m=+229.302372576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.782900 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.783014 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.282996472 +0000 UTC m=+229.402628705 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.783398 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.783760 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.283749341 +0000 UTC m=+229.403381574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.885178 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:34 crc kubenswrapper[4713]: E0308 00:09:34.885913 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.385892647 +0000 UTC m=+229.505524880 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.932958 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" event={"ID":"496a4fbf-c338-4b64-96a5-dda456094c28","Type":"ContainerStarted","Data":"cb11a6658b39cb703d8113bf5a062563b52b88c1bbd96ee7254651b3846fcc57"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.941159 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" event={"ID":"9fed4c23-4a16-4502-87eb-d1dd68aa1af5","Type":"ContainerStarted","Data":"2a68097e188634237fb4d5e58d360c20797f8f0410061c29d2759430b638f631"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.943387 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" event={"ID":"cb14cb41-8f32-4fd8-9eb8-2446ddfd85e0","Type":"ContainerStarted","Data":"03c209db335e58ea5662b7255481b43b8d7ba579b7f2816ef681de60076745f6"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.946421 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" event={"ID":"2be1cb07-55b6-4220-989e-13415c3156b2","Type":"ContainerStarted","Data":"b9ed1e36977e077482671111c31c7d2ed9d272672f4b5cc953db2d76ad581370"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.948569 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" event={"ID":"9e570b68-8b4c-42e3-839d-f37943999246","Type":"ContainerStarted","Data":"fd9a48944f15c013216b1e59cc31e3539b1ac73b38b0051a0a81749066e50d41"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.950119 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" event={"ID":"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4","Type":"ContainerStarted","Data":"a517b5241ccbf241e1f4fe7609545a13698dd49b10242725eaeb8822a82084d8"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.957127 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" event={"ID":"ee63f184-4609-43d4-bdc1-2c840aef6d7f","Type":"ContainerStarted","Data":"a8b4209283dacd63ee8a200d4e5a6a96337e44c09b55c6b835f3ad418c0ad093"} Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.960141 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.960243 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.960293 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8m94r container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.960332 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" podUID="0d2f415a-2626-45f9-baf0-68ab25b9d079" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.961397 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bn56j container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.961445 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" podUID="5eb834dd-5358-45c4-bbca-50baf0e8656b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 08 00:09:34 crc kubenswrapper[4713]: I0308 00:09:34.977870 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4cd9v" podStartSLOduration=156.977845528 podStartE2EDuration="2m36.977845528s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:34.973440387 +0000 UTC m=+229.093072620" watchObservedRunningTime="2026-03-08 00:09:34.977845528 +0000 UTC m=+229.097477771" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.002451 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.002752 4713 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.502736323 +0000 UTC m=+229.622368556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.150567 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.150970 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.650952598 +0000 UTC m=+229.770584831 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.306208 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.306589 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.806569538 +0000 UTC m=+229.926201841 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.313593 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmgvw" podStartSLOduration=157.313576854 podStartE2EDuration="2m37.313576854s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:34.996862336 +0000 UTC m=+229.116494569" watchObservedRunningTime="2026-03-08 00:09:35.313576854 +0000 UTC m=+229.433209087" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.314713 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvsbq" podStartSLOduration=157.314705592 podStartE2EDuration="2m37.314705592s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.312513967 +0000 UTC m=+229.432146200" watchObservedRunningTime="2026-03-08 00:09:35.314705592 +0000 UTC m=+229.434337825" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.320799 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.338279 4713 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" podStartSLOduration=157.338256914 podStartE2EDuration="2m37.338256914s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.337731861 +0000 UTC m=+229.457364094" watchObservedRunningTime="2026-03-08 00:09:35.338256914 +0000 UTC m=+229.457889157" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.356102 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" podStartSLOduration=157.356083472 podStartE2EDuration="2m37.356083472s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.355022755 +0000 UTC m=+229.474654988" watchObservedRunningTime="2026-03-08 00:09:35.356083472 +0000 UTC m=+229.475715705" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.376952 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-sxbdk" podStartSLOduration=7.376931126 podStartE2EDuration="7.376931126s" podCreationTimestamp="2026-03-08 00:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.374903075 +0000 UTC m=+229.494535318" watchObservedRunningTime="2026-03-08 00:09:35.376931126 +0000 UTC m=+229.496563389" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.397874 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-drs4q" podStartSLOduration=157.397851841 podStartE2EDuration="2m37.397851841s" podCreationTimestamp="2026-03-08 00:06:58 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.394139298 +0000 UTC m=+229.513771531" watchObservedRunningTime="2026-03-08 00:09:35.397851841 +0000 UTC m=+229.517484074" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.407008 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.408514 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:35.908491429 +0000 UTC m=+230.028123662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.420813 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-58c66" podStartSLOduration=158.420797378 podStartE2EDuration="2m38.420797378s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.420154222 +0000 UTC m=+229.539786455" watchObservedRunningTime="2026-03-08 00:09:35.420797378 +0000 UTC m=+229.540429611" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.508649 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.508989 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.008971613 +0000 UTC m=+230.128603846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.609912 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.610288 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.110271639 +0000 UTC m=+230.229903872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.711808 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.712190 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.212172439 +0000 UTC m=+230.331804672 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.813367 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.813551 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.313503395 +0000 UTC m=+230.433135628 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.813610 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.813989 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.313972687 +0000 UTC m=+230.433604910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.887297 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.887608 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.887649 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.915033 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:35 crc kubenswrapper[4713]: E0308 00:09:35.915312 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.415297693 +0000 UTC m=+230.534929926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.964247 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" event={"ID":"ccf0e825-0465-40ae-b0ca-f4f7c377e518","Type":"ContainerStarted","Data":"b1bd8cefe222cc7b85756393bbccec0bebade9d8bd0e8902a6b8e0a194d2fc57"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.965691 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xmjhj" event={"ID":"158ba4b3-9da3-4a83-95dd-e625c7b19a2b","Type":"ContainerStarted","Data":"ddfcb2d55f56fcd69cf955f63872c49317a99abe32c31680854a4c6388206952"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.967103 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" event={"ID":"c9df8d9c-b59f-4a1c-9fb4-668123290569","Type":"ContainerStarted","Data":"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.967305 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.968502 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" 
event={"ID":"2a04a017-1594-43d7-a796-8c676b28095e","Type":"ContainerStarted","Data":"c8ec75cd7a186f4467889f8e0fcfe9eae850fd7f8f43899ce233be5db2fb4c2c"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.968804 4713 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-c8gbn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" start-of-body= Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.968850 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.970169 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lwhnh" event={"ID":"39da2ba4-aebb-485b-8e46-7ffc36efa490","Type":"ContainerStarted","Data":"7c7edf766cc4bfbce05c51380f357c719f0be9f041874a17dca5fed8d540a66e"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.972024 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" event={"ID":"fd936d68-81ed-4923-8078-5ad0116d532e","Type":"ContainerStarted","Data":"875521de81715b88c169372fab2ed2cb0adebaeaadaacf944e8db61b0f28cd19"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.973289 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" event={"ID":"0dbf7b38-8980-49e5-956c-08e443912846","Type":"ContainerStarted","Data":"fa32a54cb695b8a35913b6b0e2a5406f92837e60651424b5ca87b3e7dc75adff"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.975331 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" event={"ID":"899ec382-6c79-460e-9e3c-9dfb25867855","Type":"ContainerStarted","Data":"c08b2fb485dc1ec5c4dcc92d157f7f830eab40b020da577c868ec0e26f18d3e1"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.977018 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" event={"ID":"0e43994e-0aa1-4541-bce9-502bbc1dc0a0","Type":"ContainerStarted","Data":"d51e6fd41ac9c40899b923adfbe32076a9b6cc968bf920ed04170b7bfe90da00"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.981288 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xmjhj" podStartSLOduration=7.9812748110000005 podStartE2EDuration="7.981274811s" podCreationTimestamp="2026-03-08 00:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:35.979450645 +0000 UTC m=+230.099082888" watchObservedRunningTime="2026-03-08 00:09:35.981274811 +0000 UTC m=+230.100907044" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.984131 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" event={"ID":"6e21b584-0781-4fa9-8811-332d42755c17","Type":"ContainerStarted","Data":"fff4729dff8af17b584d22a6436f22684389579715022ba86586f8cca9f4618d"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.986715 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" event={"ID":"3a74e1e8-3928-4220-b55d-ee42585ef1ee","Type":"ContainerStarted","Data":"3e011924e2fe3315854a0f3623269b9572c982571676ed5a2133605ddc8f6b2e"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.989032 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" event={"ID":"f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc","Type":"ContainerStarted","Data":"e379738a4bef0a60ed14f3cf8d8a3c30d4a82ab1f64b9b1d40ccc937816c8a85"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.990374 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" event={"ID":"c9f8ace1-247f-4128-b3f7-95037fb1a156","Type":"ContainerStarted","Data":"75d05f92fb5abe52844bbae56dec71015f076443bb20f74c35d27309150cfd58"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.991862 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" event={"ID":"3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6","Type":"ContainerStarted","Data":"1c7678d5dbfcf2643ccdb86b5564eb19a218ea616ec11db244e26dbed403cb0b"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.992082 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.993501 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-g99pk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.993540 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" podUID="3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.993792 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" event={"ID":"d3811a82-b0fe-4e06-948a-79cbbc840a98","Type":"ContainerStarted","Data":"2f71a72df5cc338370ced373637ec8de9d7b684f577930629be009740cd59848"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.995638 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" event={"ID":"452f8fcb-d31f-41d4-be85-d041d7efc756","Type":"ContainerStarted","Data":"65470464808bdda97e1a5591cd4693db924ffd2ec404d34cc73a8e884cacae00"} Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.997477 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8m94r container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.997510 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r" podUID="0d2f415a-2626-45f9-baf0-68ab25b9d079" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.997587 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bn56j container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 08 00:09:35 crc kubenswrapper[4713]: I0308 00:09:35.997601 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j" podUID="5eb834dd-5358-45c4-bbca-50baf0e8656b" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.006944 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" podStartSLOduration=159.006923755 podStartE2EDuration="2m39.006923755s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.004503595 +0000 UTC m=+230.124135828" watchObservedRunningTime="2026-03-08 00:09:36.006923755 +0000 UTC m=+230.126555988" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.016503 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.017004 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.516976788 +0000 UTC m=+230.636609011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.050717 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5bltg" podStartSLOduration=158.050677535 podStartE2EDuration="2m38.050677535s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.025295857 +0000 UTC m=+230.144928090" watchObservedRunningTime="2026-03-08 00:09:36.050677535 +0000 UTC m=+230.170309768" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.077122 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" podStartSLOduration=159.077107609 podStartE2EDuration="2m39.077107609s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.076202186 +0000 UTC m=+230.195834419" watchObservedRunningTime="2026-03-08 00:09:36.077107609 +0000 UTC m=+230.196739832" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.086109 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p529" podStartSLOduration=158.086035233 podStartE2EDuration="2m38.086035233s" podCreationTimestamp="2026-03-08 
00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.053942437 +0000 UTC m=+230.173574670" watchObservedRunningTime="2026-03-08 00:09:36.086035233 +0000 UTC m=+230.205667466" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.106130 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" podStartSLOduration=158.106106008 podStartE2EDuration="2m38.106106008s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.105342518 +0000 UTC m=+230.224974751" watchObservedRunningTime="2026-03-08 00:09:36.106106008 +0000 UTC m=+230.225738261" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.117258 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.120525 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.620505759 +0000 UTC m=+230.740137992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.124545 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-shncx" podStartSLOduration=158.12452203 podStartE2EDuration="2m38.12452203s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.122174021 +0000 UTC m=+230.241806264" watchObservedRunningTime="2026-03-08 00:09:36.12452203 +0000 UTC m=+230.244154263" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.139025 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" podStartSLOduration=158.139004024 podStartE2EDuration="2m38.139004024s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.138360018 +0000 UTC m=+230.257992261" watchObservedRunningTime="2026-03-08 00:09:36.139004024 +0000 UTC m=+230.258636257" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.161771 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pvc8t" podStartSLOduration=158.161752426 podStartE2EDuration="2m38.161752426s" podCreationTimestamp="2026-03-08 
00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.161561281 +0000 UTC m=+230.281193514" watchObservedRunningTime="2026-03-08 00:09:36.161752426 +0000 UTC m=+230.281384669" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.178929 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7wd77" podStartSLOduration=158.178909887 podStartE2EDuration="2m38.178909887s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.177797279 +0000 UTC m=+230.297429542" watchObservedRunningTime="2026-03-08 00:09:36.178909887 +0000 UTC m=+230.298542120" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.192985 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jhxcl" podStartSLOduration=158.19296447 podStartE2EDuration="2m38.19296447s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.192733974 +0000 UTC m=+230.312366207" watchObservedRunningTime="2026-03-08 00:09:36.19296447 +0000 UTC m=+230.312596703" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.214028 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" podStartSLOduration=158.214008319 podStartE2EDuration="2m38.214008319s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.212091341 
+0000 UTC m=+230.331723584" watchObservedRunningTime="2026-03-08 00:09:36.214008319 +0000 UTC m=+230.333640552" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.224083 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.224461 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.724448241 +0000 UTC m=+230.844080464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.237434 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-c4nq5" podStartSLOduration=158.237410357 podStartE2EDuration="2m38.237410357s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:36.235722814 +0000 UTC m=+230.355355067" watchObservedRunningTime="2026-03-08 00:09:36.237410357 +0000 UTC m=+230.357042590" Mar 08 00:09:36 crc 
kubenswrapper[4713]: I0308 00:09:36.320334 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.325127 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.325252 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.825232994 +0000 UTC m=+230.944865237 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.325491 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.325933 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.825915901 +0000 UTC m=+230.945548134 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.427200 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.427588 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:36.927569705 +0000 UTC m=+231.047201938 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.528503 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.529045 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.029027624 +0000 UTC m=+231.148659927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.630199 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.630540 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.130523605 +0000 UTC m=+231.250155838 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.731322 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.733092 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.233071561 +0000 UTC m=+231.352703844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.833068 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.833413 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.333387652 +0000 UTC m=+231.453019885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.888511 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.888580 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 08 00:09:36 crc kubenswrapper[4713]: I0308 00:09:36.935064 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:36 crc kubenswrapper[4713]: E0308 00:09:36.935498 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.435483747 +0000 UTC m=+231.555115980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.010697 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" event={"ID":"3a74e1e8-3928-4220-b55d-ee42585ef1ee","Type":"ContainerStarted","Data":"272378b14ead6b0fe8f70c4a69a4e8e415883406601525810e82923d770b8d6f"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.013253 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" event={"ID":"d2708ad9-cf03-4a75-9b53-fa4ee96d8fc4","Type":"ContainerStarted","Data":"ecacfcd803dd5b2c9f84eccc1b8c3ca6289b45f4bf70a11e20eed2588dfed870"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.013390 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.021692 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lwhnh" event={"ID":"39da2ba4-aebb-485b-8e46-7ffc36efa490","Type":"ContainerStarted","Data":"063872b729e3f32ff6b60124c486f029cdd9345a46169aea92b871431d458350"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.023503 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" 
event={"ID":"d3811a82-b0fe-4e06-948a-79cbbc840a98","Type":"ContainerStarted","Data":"1544e4356d60f3367c60a34a2f6fa643b4ceb544c5db56eca24d5dd1b21d7db2"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.027249 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" event={"ID":"496a4fbf-c338-4b64-96a5-dda456094c28","Type":"ContainerStarted","Data":"a8dce851bd245a5dbc4a99d4117015c1cf2fed3bca5c996d3702ed4d45852654"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.031184 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" event={"ID":"9fed4c23-4a16-4502-87eb-d1dd68aa1af5","Type":"ContainerStarted","Data":"b29a6383591408b33e46513d52dd44a7999f4a23ae697a854adb2bf157892504"} Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.031813 4713 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-c8gbn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" start-of-body= Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.031873 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.032163 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-g99pk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 
00:09:37.032193 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" podUID="3419fd8b-68a4-4414-b8c1-ee50eaa0d4b6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.033344 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" podStartSLOduration=159.033331356 podStartE2EDuration="2m39.033331356s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.030867044 +0000 UTC m=+231.150499307" watchObservedRunningTime="2026-03-08 00:09:37.033331356 +0000 UTC m=+231.152963589" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.035697 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.036291 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.53627557 +0000 UTC m=+231.655907803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.047013 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q7bjv" podStartSLOduration=159.046992919 podStartE2EDuration="2m39.046992919s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.045529562 +0000 UTC m=+231.165161795" watchObservedRunningTime="2026-03-08 00:09:37.046992919 +0000 UTC m=+231.166625152" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.097145 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bltk5" podStartSLOduration=159.097125079 podStartE2EDuration="2m39.097125079s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.078585453 +0000 UTC m=+231.198217696" watchObservedRunningTime="2026-03-08 00:09:37.097125079 +0000 UTC m=+231.216757322" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.118488 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2qwgb" podStartSLOduration=159.118472095 podStartE2EDuration="2m39.118472095s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.098300828 +0000 UTC m=+231.217933061" watchObservedRunningTime="2026-03-08 00:09:37.118472095 +0000 UTC m=+231.238104328" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.120606 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4qpfj" podStartSLOduration=159.120596198 podStartE2EDuration="2m39.120596198s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.117809048 +0000 UTC m=+231.237441281" watchObservedRunningTime="2026-03-08 00:09:37.120596198 +0000 UTC m=+231.240228431" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.137748 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.142747 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.642730435 +0000 UTC m=+231.762362758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.152325 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdq97" podStartSLOduration=160.152304895 podStartE2EDuration="2m40.152304895s" podCreationTimestamp="2026-03-08 00:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.137011201 +0000 UTC m=+231.256643454" watchObservedRunningTime="2026-03-08 00:09:37.152304895 +0000 UTC m=+231.271937128" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.155283 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-xr24g" podStartSLOduration=159.15527509 podStartE2EDuration="2m39.15527509s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.151957017 +0000 UTC m=+231.271589260" watchObservedRunningTime="2026-03-08 00:09:37.15527509 +0000 UTC m=+231.274907333" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.169535 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wld5v" podStartSLOduration=159.169518798 podStartE2EDuration="2m39.169518798s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:37.167241471 +0000 UTC m=+231.286873724" watchObservedRunningTime="2026-03-08 00:09:37.169518798 +0000 UTC m=+231.289151031" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.239412 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.239612 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.739579708 +0000 UTC m=+231.859211931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.239676 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.240023 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.740015369 +0000 UTC m=+231.859647602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.341051 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.341238 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.841220092 +0000 UTC m=+231.960852325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.341357 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.341651 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.841644093 +0000 UTC m=+231.961276316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.442486 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.442610 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.942592758 +0000 UTC m=+232.062224991 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.442762 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.443053 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:37.94304166 +0000 UTC m=+232.062673883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.543626 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.543762 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.04374479 +0000 UTC m=+232.163377023 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.543891 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.544146 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.04413914 +0000 UTC m=+232.163771373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.644740 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.644943 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.144915252 +0000 UTC m=+232.264547485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.645200 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.645535 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.145527387 +0000 UTC m=+232.265159620 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.746032 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.746200 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.246170456 +0000 UTC m=+232.365802689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.746454 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.746772 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.246759581 +0000 UTC m=+232.366391814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.848087 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.848297 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.348264022 +0000 UTC m=+232.467896255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.848376 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.848688 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.348680652 +0000 UTC m=+232.468312875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.888182 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.888273 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.949196 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.949354 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.449320821 +0000 UTC m=+232.568953054 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:37 crc kubenswrapper[4713]: I0308 00:09:37.949538 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:37 crc kubenswrapper[4713]: E0308 00:09:37.949842 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.449816593 +0000 UTC m=+232.569448886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.043099 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.043254 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-k5mg9 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.043505 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" podUID="452f8fcb-d31f-41d4-be85-d041d7efc756" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.051785 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.051978 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.55195685 +0000 UTC m=+232.671589083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.052472 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.054788 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.55477065 +0000 UTC m=+232.674402883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.088275 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lwhnh" podStartSLOduration=11.088254732 podStartE2EDuration="11.088254732s" podCreationTimestamp="2026-03-08 00:09:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:38.068468035 +0000 UTC m=+232.188100288" watchObservedRunningTime="2026-03-08 00:09:38.088254732 +0000 UTC m=+232.207886975" Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.089904 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6swxn" podStartSLOduration=160.089895033 podStartE2EDuration="2m40.089895033s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:38.086491947 +0000 UTC m=+232.206124190" watchObservedRunningTime="2026-03-08 00:09:38.089895033 +0000 UTC m=+232.209527266" Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.153540 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.153806 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.653775588 +0000 UTC m=+232.773407821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.154087 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.155079 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.65506717 +0000 UTC m=+232.774699403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.255228 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.255393 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.755367221 +0000 UTC m=+232.874999454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.255482 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.255919 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.755907394 +0000 UTC m=+232.875539627 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.356885 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.357077 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.857049466 +0000 UTC m=+232.976681699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.357150 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.357482 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.857472006 +0000 UTC m=+232.977104309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.457849 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.458183 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:38.958166126 +0000 UTC m=+233.077798349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.559101 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.559459 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.059445611 +0000 UTC m=+233.179077844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.660145 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.660546 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.160528141 +0000 UTC m=+233.280160374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.761691 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.762045 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.262029141 +0000 UTC m=+233.381661374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.862879 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.863001 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.362983668 +0000 UTC m=+233.482615901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.863080 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.863377 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.363369148 +0000 UTC m=+233.483001381 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.888701 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:38 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:38 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:38 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.888774 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:38 crc kubenswrapper[4713]: I0308 00:09:38.963718 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:38 crc kubenswrapper[4713]: E0308 00:09:38.964097 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:39.464081898 +0000 UTC m=+233.583714131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.065042 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.065368 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.565357363 +0000 UTC m=+233.684989596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.166698 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.167423 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.667402857 +0000 UTC m=+233.787035090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.268894 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.269308 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.769289877 +0000 UTC m=+233.888922110 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.321192 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-k5mg9 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.321192 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-k5mg9 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.321242 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" podUID="452f8fcb-d31f-41d4-be85-d041d7efc756" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.321305 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" podUID="452f8fcb-d31f-41d4-be85-d041d7efc756" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: 
connection refused" Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.371705 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.372130 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:39.872111311 +0000 UTC m=+233.991743544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.473302 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.473677 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:39.973662433 +0000 UTC m=+234.093294666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.574633 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.574801 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.074783214 +0000 UTC m=+234.194415447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.574914 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.575187 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.075180844 +0000 UTC m=+234.194813077 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.671136 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.675440 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.675554 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.175530495 +0000 UTC m=+234.295162728 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.676125 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.17611752 +0000 UTC m=+234.295749753 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.675895 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.682095 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l464l" Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.767868 4713 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.768609 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.777280 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.777505 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.277472906 +0000 UTC m=+234.397105139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.777836 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.778335 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.278313018 +0000 UTC m=+234.397945251 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.830916 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.879412 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.880404 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.379591062 +0000 UTC m=+234.499223295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.880523 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.880930 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.380919636 +0000 UTC m=+234.500551869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.898620 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:39 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:39 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:39 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.898670 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.957520 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:39 crc kubenswrapper[4713]: I0308 00:09:39.981752 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:39 crc kubenswrapper[4713]: E0308 00:09:39.982868 4713 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.482835867 +0000 UTC m=+234.602468100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.071331 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" event={"ID":"063a79dd-fbe8-4562-98bc-deb309b25182","Type":"ContainerStarted","Data":"7470769f57edb813356a2be9d5379cbabe535bd2a0b0a02d545f7198d60d26db"} Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.087190 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51858: no serving certificate available for the kubelet" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.087630 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.088729 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:40.588712707 +0000 UTC m=+234.708344940 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.189168 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.189387 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.689356356 +0000 UTC m=+234.808988589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.189534 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.189956 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.689944621 +0000 UTC m=+234.809576854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.220017 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51870: no serving certificate available for the kubelet" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.291045 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.291352 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.791336558 +0000 UTC m=+234.910968791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.301357 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.302178 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.302211 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.303068 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.303570 4713 patch_prober.go:28] interesting pod/console-f9d7485db-gk97q container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.303635 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-gk97q" podUID="1d068555-56f2-4bcf-8b4c-cc574ad087fa" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 08 00:09:40 crc kubenswrapper[4713]: W0308 00:09:40.307009 4713 reflector.go:561] object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n": failed to list *v1.Secret: secrets "installer-sa-dockercfg-5pr6n" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-kube-apiserver": no relationship found between node 'crc' and this object Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.307056 4713 reflector.go:158] "Unhandled Error" err="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-5pr6n\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"installer-sa-dockercfg-5pr6n\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-kube-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 08 00:09:40 crc kubenswrapper[4713]: W0308 00:09:40.307118 4713 reflector.go:561] object-"openshift-kube-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-kube-apiserver": no 
relationship found between node 'crc' and this object
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.307134 4713 reflector.go:158] "Unhandled Error" err="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-kube-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.318200 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.341993 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body=
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.342042 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.342254 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body=
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.342268 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.367024 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51876: no serving certificate available for the kubelet"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.392204 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.392397 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.392476 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.393523 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.893511356 +0000 UTC m=+235.013143589 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.397885 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2k6nd"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.455629 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51890: no serving certificate available for the kubelet"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.493200 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.494074 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.993371055 +0000 UTC m=+235.113003288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.494156 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.494288 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.494417 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.494910 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:40.994902243 +0000 UTC m=+235.114534476 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.495047 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.527188 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"]
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.528377 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.532061 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.572139 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"]
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.572476 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51906: no serving certificate available for the kubelet"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.597386 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.597508 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.597650 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prrdn\" (UniqueName: \"kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.597684 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.597794 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.097777478 +0000 UTC m=+235.217409711 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.628134 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bn56j"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.653033 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8m94r"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.699156 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prrdn\" (UniqueName: \"kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.699248 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.699289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.699316 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.699561 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.199547125 +0000 UTC m=+235.319179358 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.701639 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.702313 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.723843 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4tj99"]
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.725063 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.730230 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.735661 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51908: no serving certificate available for the kubelet"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.765014 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4tj99"]
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.784858 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prrdn\" (UniqueName: \"kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn\") pod \"certified-operators-x6gcb\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.804454 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.804649 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.804694 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.804719 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8fx2\" (UniqueName: \"kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.804884 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.304869082 +0000 UTC m=+235.424501315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.843925 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.886326 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-drs4q"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.889153 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 00:09:40 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld
Mar 08 00:09:40 crc kubenswrapper[4713]: [+]process-running ok
Mar 08 00:09:40 crc kubenswrapper[4713]: healthz check failed
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.889193 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.905566 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.905625 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.905650 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8fx2\" (UniqueName: \"kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.905691 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:40 crc kubenswrapper[4713]: E0308 00:09:40.905970 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.405958742 +0000 UTC m=+235.525590975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.906316 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.906526 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.915114 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"]
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.915970 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x7pkf"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.933008 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.940703 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.946367 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"]
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.954527 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51912: no serving certificate available for the kubelet"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.964494 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8fx2\" (UniqueName: \"kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2\") pod \"community-operators-4tj99\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:09:40 crc kubenswrapper[4713]: I0308 00:09:40.994900 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.008636 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.008951 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bjqb\" (UniqueName: \"kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.009022 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.009088 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf"
Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.009186 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.509171334 +0000 UTC m=+235.628803567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.040856 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.059073 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.073154 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.101078 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.101281 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.115934 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.116024 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.116108 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bjqb\" (UniqueName: \"kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.119382 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.119907 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.124513 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf"
Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.124775 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.624762279 +0000 UTC m=+235.744394512 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.125637 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.138468 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pd9br"]
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.150203 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pd9br"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.174929 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pd9br"]
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.186413 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bjqb\" (UniqueName: \"kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb\") pod \"certified-operators-x7pkf\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " pod="openshift-marketplace/certified-operators-x7pkf"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221414 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221712 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221776 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4bc\" (UniqueName: \"kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221848 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221881 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.221922 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.222097 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.722081754 +0000 UTC m=+235.841713987 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.238189 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x7pkf"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324230 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324280 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324327 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324350 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324370 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.324396 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4bc\" (UniqueName: \"kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.325008 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br"
Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.325219 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br"
Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.325553 4713 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.825543194 +0000 UTC m=+235.945175427 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.325678 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.350767 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t4bc\" (UniqueName: \"kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc\") pod \"community-operators-pd9br\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.363816 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51926: no serving certificate available for the kubelet" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.372129 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.405211 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.416394 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.427625 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.428043 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:41.928027899 +0000 UTC m=+236.047660132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.443715 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"] Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.461500 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g99pk" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.470016 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: W0308 00:09:41.471264 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9341928_7a63_4190_ac37_ac9ba3320e18.slice/crio-8da0f0760030352f0e71a9d8d27a1069de63fe3b39a327ba9c1b618d352e4f81 WatchSource:0}: Error finding container 8da0f0760030352f0e71a9d8d27a1069de63fe3b39a327ba9c1b618d352e4f81: Status 404 returned error can't find the container with id 8da0f0760030352f0e71a9d8d27a1069de63fe3b39a327ba9c1b618d352e4f81 Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.484111 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.501528 4713 patch_prober.go:28] interesting pod/apiserver-76f77b778f-58c66 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]log ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]etcd ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/max-in-flight-filter ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 08 00:09:41 crc kubenswrapper[4713]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 08 00:09:41 crc kubenswrapper[4713]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/project.openshift.io-projectcache ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/openshift.io-startinformers ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 08 00:09:41 crc kubenswrapper[4713]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 00:09:41 crc kubenswrapper[4713]: livez check failed Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.501589 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-58c66" podUID="bfa92863-23f8-42d4-8e73-433bf546d304" 
containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.529165 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.529556 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.029540119 +0000 UTC m=+236.149172352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.631334 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.631735 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.131712987 +0000 UTC m=+236.251345220 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.694184 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.696967 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.733642 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.734223 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.234212172 +0000 UTC m=+236.353844405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.838638 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.839001 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.338986235 +0000 UTC m=+236.458618468 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.912076 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4tj99"] Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.940684 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:41 crc kubenswrapper[4713]: E0308 00:09:41.941166 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.441138672 +0000 UTC m=+236.560770905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.971988 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:41 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:41 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:41 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:41 crc kubenswrapper[4713]: I0308 00:09:41.972037 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.045168 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.045443 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 00:09:42.545416382 +0000 UTC m=+236.665048605 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.045626 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.046717 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.546705784 +0000 UTC m=+236.666338017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.066360 4713 ???:1] "http: TLS handshake error from 192.168.126.11:51932: no serving certificate available for the kubelet" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.146865 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pd9br"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.147366 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.147636 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.64762139 +0000 UTC m=+236.767253623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.167424 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerStarted","Data":"3cdea3678803ad7453d0a386b7a4a0468a866e4a3767422ad83b05a97ef4bf14"} Mar 08 00:09:42 crc kubenswrapper[4713]: W0308 00:09:42.179254 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd4a956b_6edb_436e_bd5e_5d57899c2ea1.slice/crio-135e656a965d1b87bbb089b3e89dbd03d0497fd3df39d718203e4d15ec7454b9 WatchSource:0}: Error finding container 135e656a965d1b87bbb089b3e89dbd03d0497fd3df39d718203e4d15ec7454b9: Status 404 returned error can't find the container with id 135e656a965d1b87bbb089b3e89dbd03d0497fd3df39d718203e4d15ec7454b9 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.179839 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerDied","Data":"e4404a3c0caa01e5acd1c3db2a69f4b96b4d1f768431d32a330b55a8351235db"} Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.179705 4713 generic.go:334] "Generic (PLEG): container finished" podID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerID="e4404a3c0caa01e5acd1c3db2a69f4b96b4d1f768431d32a330b55a8351235db" exitCode=0 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.179900 4713 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerStarted","Data":"8da0f0760030352f0e71a9d8d27a1069de63fe3b39a327ba9c1b618d352e4f81"} Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.197742 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.216125 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.237460 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.237720 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" containerID="cri-o://9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba" gracePeriod=30 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.245049 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.245320 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" containerID="cri-o://a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638" gracePeriod=30 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.249383 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.249669 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.749658024 +0000 UTC m=+236.869290257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: W0308 00:09:42.297420 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc33b42a1_bf95_490f_a907_765855ec81d1.slice/crio-8b84966b96c0ed6376bfb58ebe4d50727b2f7c4a888ad1b3e8b431d7574ba8b4 WatchSource:0}: Error finding container 8b84966b96c0ed6376bfb58ebe4d50727b2f7c4a888ad1b3e8b431d7574ba8b4: Status 404 returned error can't find the container with id 8b84966b96c0ed6376bfb58ebe4d50727b2f7c4a888ad1b3e8b431d7574ba8b4 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.303635 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.325438 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k5mg9" Mar 08 
00:09:42 crc kubenswrapper[4713]: W0308 00:09:42.331088 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podde40fceb_b995_45d6_8272_3a93c1b85bc8.slice/crio-43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4 WatchSource:0}: Error finding container 43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4: Status 404 returned error can't find the container with id 43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4 Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.350357 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.351092 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.851073612 +0000 UTC m=+236.970705845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.453448 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.453772 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:42.953752382 +0000 UTC m=+237.073384615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.518890 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.521386 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.524911 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.525479 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.555286 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.555497 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.055477158 +0000 UTC m=+237.175109391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.555645 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.556385 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.05636787 +0000 UTC m=+237.176000103 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.656797 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.656952 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.156922957 +0000 UTC m=+237.276555190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.657061 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsx97\" (UniqueName: \"kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.657112 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.657151 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.657174 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.657467 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.15745562 +0000 UTC m=+237.277087853 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.758631 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.758916 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsx97\" (UniqueName: \"kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.759015 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.759067 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.759656 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.759747 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.25972849 +0000 UTC m=+237.379360723 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.760436 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.777114 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsx97\" (UniqueName: \"kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97\") pod \"redhat-marketplace-5hssk\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.816958 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.862289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.862669 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.362652226 +0000 UTC m=+237.482284459 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.892704 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:42 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:42 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:42 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.892746 4713 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.933899 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.934938 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.948603 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:09:42 crc kubenswrapper[4713]: I0308 00:09:42.963518 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:42 crc kubenswrapper[4713]: E0308 00:09:42.964566 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.464550866 +0000 UTC m=+237.584183099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.065606 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.065975 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.066032 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.066091 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxjck\" (UniqueName: 
\"kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.066358 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.566347394 +0000 UTC m=+237.685979627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.166673 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.166860 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.666814468 +0000 UTC m=+237.786446701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.166980 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.167052 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.167099 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxjck\" (UniqueName: \"kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.167146 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content\") pod \"redhat-marketplace-hs88q\" (UID: 
\"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.167582 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.167637 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.667629268 +0000 UTC m=+237.787261491 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.168029 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.175908 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.186751 4713 generic.go:334] "Generic (PLEG): container finished" podID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerID="a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.186888 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" event={"ID":"c5cc5125-93f0-4709-afbd-7aa6a888b641","Type":"ContainerDied","Data":"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.186963 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" event={"ID":"c5cc5125-93f0-4709-afbd-7aa6a888b641","Type":"ContainerDied","Data":"4dcd3efc63c2bb82108f5db86db8f7d5ce1c4ffb7c4a91ed149a6c9ab7e1050e"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.187160 4713 scope.go:117] "RemoveContainer" containerID="a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.187288 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.191627 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.192491 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxjck\" (UniqueName: \"kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck\") pod \"redhat-marketplace-hs88q\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.198909 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"de40fceb-b995-45d6-8272-3a93c1b85bc8","Type":"ContainerStarted","Data":"4ff9eb52dff6453e29d770097f03f20f6662ef54a0468dd632573c2f6fb34657"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.198953 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"de40fceb-b995-45d6-8272-3a93c1b85bc8","Type":"ContainerStarted","Data":"43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.210124 4713 generic.go:334] "Generic (PLEG): container finished" podID="2a04a017-1594-43d7-a796-8c676b28095e" containerID="c8ec75cd7a186f4467889f8e0fcfe9eae850fd7f8f43899ce233be5db2fb4c2c" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.210201 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" event={"ID":"2a04a017-1594-43d7-a796-8c676b28095e","Type":"ContainerDied","Data":"c8ec75cd7a186f4467889f8e0fcfe9eae850fd7f8f43899ce233be5db2fb4c2c"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.219263 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" 
event={"ID":"063a79dd-fbe8-4562-98bc-deb309b25182","Type":"ContainerStarted","Data":"f9994e738e641d54be6f247f3a1e0358bcb1b2e919a54e81e49a4879ccbc6546"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.223562 4713 generic.go:334] "Generic (PLEG): container finished" podID="40864d72-e137-478e-8340-8c0f107b4c60" containerID="b521ece8028ebf9207946445f9aecae87b7e5c6d252fd707c34dc0276256c2c0" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.223727 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerDied","Data":"b521ece8028ebf9207946445f9aecae87b7e5c6d252fd707c34dc0276256c2c0"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.226815 4713 scope.go:117] "RemoveContainer" containerID="a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.228125 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638\": container with ID starting with a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638 not found: ID does not exist" containerID="a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.228155 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638"} err="failed to get container status \"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638\": rpc error: code = NotFound desc = could not find container \"a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638\": container with ID starting with a68b4ccfdfbaf91b0589175f60e09a31251dadc4c8962143c6e936d1c65c0638 not found: ID does not exist" Mar 08 00:09:43 crc 
kubenswrapper[4713]: I0308 00:09:43.233284 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.233199706 podStartE2EDuration="3.233199706s" podCreationTimestamp="2026-03-08 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:43.232764575 +0000 UTC m=+237.352396808" watchObservedRunningTime="2026-03-08 00:09:43.233199706 +0000 UTC m=+237.352831939" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.244965 4713 generic.go:334] "Generic (PLEG): container finished" podID="c33b42a1-bf95-490f-a907-765855ec81d1" containerID="f219be814b1ac8475a83125ee5f48f62c739076f91025a6595fb3c6cc2132578" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.245065 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerDied","Data":"f219be814b1ac8475a83125ee5f48f62c739076f91025a6595fb3c6cc2132578"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.245093 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerStarted","Data":"8b84966b96c0ed6376bfb58ebe4d50727b2f7c4a888ad1b3e8b431d7574ba8b4"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.258733 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.262172 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"64aa73b3-797b-405e-b2ca-db772f204659","Type":"ContainerStarted","Data":"ecd142315e97875bdcb7f48882fb2a26c6170c9668052fcb6053cd5ffcce8723"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.262241 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"64aa73b3-797b-405e-b2ca-db772f204659","Type":"ContainerStarted","Data":"55ed937bc6c9076c3c9e0296b5b1c3572c62f9313c2870371202bf79e0d60ff8"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.265779 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267650 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca\") pod \"c5cc5125-93f0-4709-afbd-7aa6a888b641\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267711 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert\") pod \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267741 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles\") pod \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " Mar 08 00:09:43 crc 
kubenswrapper[4713]: I0308 00:09:43.267765 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert\") pod \"c5cc5125-93f0-4709-afbd-7aa6a888b641\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267809 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca\") pod \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267854 4713 generic.go:334] "Generic (PLEG): container finished" podID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerID="9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267909 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" event={"ID":"e4ba1fb6-83e1-4a29-93a5-5abf00f86718","Type":"ContainerDied","Data":"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267926 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267940 4713 scope.go:117] "RemoveContainer" containerID="9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267970 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config\") pod \"c5cc5125-93f0-4709-afbd-7aa6a888b641\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.268009 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config\") pod \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.268047 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzcz5\" (UniqueName: \"kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5\") pod \"c5cc5125-93f0-4709-afbd-7aa6a888b641\" (UID: \"c5cc5125-93f0-4709-afbd-7aa6a888b641\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.268125 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-549nc\" (UniqueName: \"kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc\") pod \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\" (UID: \"e4ba1fb6-83e1-4a29-93a5-5abf00f86718\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.268706 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca" (OuterVolumeSpecName: "client-ca") pod "e4ba1fb6-83e1-4a29-93a5-5abf00f86718" (UID: "e4ba1fb6-83e1-4a29-93a5-5abf00f86718"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.269161 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca" (OuterVolumeSpecName: "client-ca") pod "c5cc5125-93f0-4709-afbd-7aa6a888b641" (UID: "c5cc5125-93f0-4709-afbd-7aa6a888b641"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.269604 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.76958413 +0000 UTC m=+237.889216433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.267928 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" event={"ID":"e4ba1fb6-83e1-4a29-93a5-5abf00f86718","Type":"ContainerDied","Data":"a48c3b313279a8d19f79d36e4fdb5a5265b310ba5fe079364f758a6f08817617"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.270176 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config" (OuterVolumeSpecName: "config") pod "c5cc5125-93f0-4709-afbd-7aa6a888b641" (UID: "c5cc5125-93f0-4709-afbd-7aa6a888b641"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.271137 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e4ba1fb6-83e1-4a29-93a5-5abf00f86718" (UID: "e4ba1fb6-83e1-4a29-93a5-5abf00f86718"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.274938 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config" (OuterVolumeSpecName: "config") pod "e4ba1fb6-83e1-4a29-93a5-5abf00f86718" (UID: "e4ba1fb6-83e1-4a29-93a5-5abf00f86718"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.276504 4713 generic.go:334] "Generic (PLEG): container finished" podID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerID="10f6a682f68f33f52b960986a98e4b9b4d5d737c5be6429ad3ce071e85a28622" exitCode=0 Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.276547 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerDied","Data":"10f6a682f68f33f52b960986a98e4b9b4d5d737c5be6429ad3ce071e85a28622"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.276593 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerStarted","Data":"135e656a965d1b87bbb089b3e89dbd03d0497fd3df39d718203e4d15ec7454b9"} Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.268013 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4xznw" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.282299 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e4ba1fb6-83e1-4a29-93a5-5abf00f86718" (UID: "e4ba1fb6-83e1-4a29-93a5-5abf00f86718"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.282557 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c5cc5125-93f0-4709-afbd-7aa6a888b641" (UID: "c5cc5125-93f0-4709-afbd-7aa6a888b641"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.282633 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5" (OuterVolumeSpecName: "kube-api-access-fzcz5") pod "c5cc5125-93f0-4709-afbd-7aa6a888b641" (UID: "c5cc5125-93f0-4709-afbd-7aa6a888b641"). InnerVolumeSpecName "kube-api-access-fzcz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.287329 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc" (OuterVolumeSpecName: "kube-api-access-549nc") pod "e4ba1fb6-83e1-4a29-93a5-5abf00f86718" (UID: "e4ba1fb6-83e1-4a29-93a5-5abf00f86718"). InnerVolumeSpecName "kube-api-access-549nc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:09:43 crc kubenswrapper[4713]: W0308 00:09:43.291229 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod822fdb72_7e7f_441b_8ebc_178ef46cca73.slice/crio-fcc1f03f798c9a1497a249637518dbb0a71923b3eba6d35aa4080c621862fa0f WatchSource:0}: Error finding container fcc1f03f798c9a1497a249637518dbb0a71923b3eba6d35aa4080c621862fa0f: Status 404 returned error can't find the container with id fcc1f03f798c9a1497a249637518dbb0a71923b3eba6d35aa4080c621862fa0f Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.306022 4713 scope.go:117] "RemoveContainer" containerID="9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.307154 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba\": container with ID starting with 9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba not found: ID does not exist" containerID="9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.307178 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba"} err="failed to get container status \"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba\": rpc error: code = NotFound desc = could not find container \"9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba\": container with ID starting with 9536e9b3624c06646894a8bbf0b9ca445d2a94426c01c655b1f4a1a1e29602ba not found: ID does not exist" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.318167 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.31812236 podStartE2EDuration="3.31812236s" podCreationTimestamp="2026-03-08 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:43.3181387 +0000 UTC m=+237.437770943" watchObservedRunningTime="2026-03-08 00:09:43.31812236 +0000 UTC m=+237.437754593" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.369650 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.369863 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-549nc\" (UniqueName: \"kubernetes.io/projected/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-kube-api-access-549nc\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.370301 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.870066145 +0000 UTC m=+237.989698378 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370453 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370488 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370499 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370512 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5cc5125-93f0-4709-afbd-7aa6a888b641-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370521 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370529 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc5125-93f0-4709-afbd-7aa6a888b641-config\") on node 
\"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370539 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ba1fb6-83e1-4a29-93a5-5abf00f86718-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.370549 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzcz5\" (UniqueName: \"kubernetes.io/projected/c5cc5125-93f0-4709-afbd-7aa6a888b641-kube-api-access-fzcz5\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.381889 4713 ???:1] "http: TLS handshake error from 192.168.126.11:56422: no serving certificate available for the kubelet" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.471469 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.471586 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.971566966 +0000 UTC m=+238.091199209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.472211 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.472512 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:43.972502009 +0000 UTC m=+238.092134242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.517361 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.519922 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7snq7"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.573495 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.573652 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.07362908 +0000 UTC m=+238.193261313 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.573776 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.574051 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.07403995 +0000 UTC m=+238.193672173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.604038 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.606435 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4xznw"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.674773 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.675232 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.175214262 +0000 UTC m=+238.294846495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.694378 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:09:43 crc kubenswrapper[4713]: W0308 00:09:43.769567 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ef0ec0c_d1f7_4ed1_81d8_fe12497c15b0.slice/crio-6fcd739b02f335d950276fc5d35bedd4422940f74a80db12ae1da2ebc8d7061a WatchSource:0}: Error finding container 6fcd739b02f335d950276fc5d35bedd4422940f74a80db12ae1da2ebc8d7061a: Status 404 returned error can't find the container with id 6fcd739b02f335d950276fc5d35bedd4422940f74a80db12ae1da2ebc8d7061a Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.775905 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.776193 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.276180739 +0000 UTC m=+238.395812972 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.877088 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.877308 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.37727731 +0000 UTC m=+238.496909543 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.877426 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.877797 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.377784672 +0000 UTC m=+238.497416985 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.889171 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:43 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:43 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:43 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.889504 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.909503 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-57pjt"] Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.910343 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.910363 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.910402 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.910412 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.910747 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" containerName="route-controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.910774 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" containerName="controller-manager" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.913450 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.915803 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.924866 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-57pjt"] Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.978587 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.978817 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfdss\" (UniqueName: \"kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss\") pod \"redhat-operators-57pjt\" (UID: 
\"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.978874 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:43 crc kubenswrapper[4713]: E0308 00:09:43.978953 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.478905563 +0000 UTC m=+238.598537796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:43 crc kubenswrapper[4713]: I0308 00:09:43.979120 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.080609 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfdss\" (UniqueName: 
\"kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.080693 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.080741 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.080819 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.081492 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.082321 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.082428 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.582411494 +0000 UTC m=+238.702043727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.100153 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfdss\" (UniqueName: \"kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss\") pod \"redhat-operators-57pjt\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.182406 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.182545 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.682528799 +0000 UTC m=+238.802161032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.182755 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.183016 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.683007631 +0000 UTC m=+238.802639854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.237397 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.271714 4713 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.283607 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.283747 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.783722342 +0000 UTC m=+238.903354575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.283896 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.284168 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.784158053 +0000 UTC m=+238.903790276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.286710 4713 generic.go:334] "Generic (PLEG): container finished" podID="de40fceb-b995-45d6-8272-3a93c1b85bc8" containerID="4ff9eb52dff6453e29d770097f03f20f6662ef54a0468dd632573c2f6fb34657" exitCode=0 Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.286764 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"de40fceb-b995-45d6-8272-3a93c1b85bc8","Type":"ContainerDied","Data":"4ff9eb52dff6453e29d770097f03f20f6662ef54a0468dd632573c2f6fb34657"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.288143 4713 generic.go:334] "Generic (PLEG): container finished" podID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerID="30fcbfe0635451c7fd3955c62a769f92ccede7936e36fa38580a85369fc7d85d" exitCode=0 Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.288187 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerDied","Data":"30fcbfe0635451c7fd3955c62a769f92ccede7936e36fa38580a85369fc7d85d"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.288202 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerStarted","Data":"6fcd739b02f335d950276fc5d35bedd4422940f74a80db12ae1da2ebc8d7061a"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.290284 4713 
generic.go:334] "Generic (PLEG): container finished" podID="64aa73b3-797b-405e-b2ca-db772f204659" containerID="ecd142315e97875bdcb7f48882fb2a26c6170c9668052fcb6053cd5ffcce8723" exitCode=0 Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.290365 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"64aa73b3-797b-405e-b2ca-db772f204659","Type":"ContainerDied","Data":"ecd142315e97875bdcb7f48882fb2a26c6170c9668052fcb6053cd5ffcce8723"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.293578 4713 generic.go:334] "Generic (PLEG): container finished" podID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerID="fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1" exitCode=0 Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.293640 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerDied","Data":"fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.293656 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerStarted","Data":"fcc1f03f798c9a1497a249637518dbb0a71923b3eba6d35aa4080c621862fa0f"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.297404 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" event={"ID":"063a79dd-fbe8-4562-98bc-deb309b25182","Type":"ContainerStarted","Data":"7f7d7a7a5f5312cb47aeedd31881890eb92d61d686058c3f78862dbedd1bf7b0"} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.321583 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.333367 4713 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.333814 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.335593 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.337165 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.339412 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.340126 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.340254 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.340256 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.341202 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.343537 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347096 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347110 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347111 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347323 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347489 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347506 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347617 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.347726 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.351246 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:09:44 crc 
kubenswrapper[4713]: I0308 00:09:44.353412 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.356594 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.385566 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.387179 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.887157591 +0000 UTC m=+239.006789834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487478 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmk7f\" (UniqueName: \"kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487519 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487550 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487617 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles\") pod 
\"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487669 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487791 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487840 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5w5j\" (UniqueName: \"kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487865 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487889 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487930 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.487965 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.488036 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.488133 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jg8b\" (UniqueName: \"kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b\") pod 
\"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.488241 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:44.988226481 +0000 UTC m=+239.107858784 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.548318 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cc5125-93f0-4709-afbd-7aa6a888b641" path="/var/lib/kubelet/pods/c5cc5125-93f0-4709-afbd-7aa6a888b641/volumes" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.549068 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ba1fb6-83e1-4a29-93a5-5abf00f86718" path="/var/lib/kubelet/pods/e4ba1fb6-83e1-4a29-93a5-5abf00f86718/volumes" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589295 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.589536 4713 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 00:09:45.089498564 +0000 UTC m=+239.209130797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589566 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589604 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589630 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589650 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589686 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jg8b\" (UniqueName: \"kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589719 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmk7f\" (UniqueName: \"kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589744 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589774 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities\") pod 
\"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589800 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589863 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589890 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589915 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5w5j\" (UniqueName: \"kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.589936 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.590355 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: E0308 00:09:44.590899 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 00:09:45.090885849 +0000 UTC m=+239.210518162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bnx6n" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.591264 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.591809 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.591809 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.592211 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.592145 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.593032 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.605067 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.607304 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5w5j\" (UniqueName: \"kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j\") pod \"controller-manager-6c6f4b84f7-f59s8\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.607872 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jg8b\" (UniqueName: \"kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.608992 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert\") pod \"route-controller-manager-857fc9cd49-86dkp\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.609475 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmk7f\" (UniqueName: \"kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f\") pod \"redhat-operators-rdgpc\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 
00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.665068 4713 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-08T00:09:44.271738831Z","Handler":null,"Name":""} Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.667443 4713 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.667478 4713 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.690730 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.691597 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.700207 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.711395 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.719445 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.775725 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.781881 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-58c66" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.792552 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.828292 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.828433 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.898629 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:44 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:44 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:44 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.898704 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:44 crc kubenswrapper[4713]: I0308 00:09:44.900986 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bnx6n\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.064886 4713 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.307745 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" event={"ID":"063a79dd-fbe8-4562-98bc-deb309b25182","Type":"ContainerStarted","Data":"e8a9049253a3fc1792b0ad8eaa854121335515ac080505e4b1d64d009bd0e53e"} Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.332213 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-q84x9" podStartSLOduration=17.332193306 podStartE2EDuration="17.332193306s" podCreationTimestamp="2026-03-08 00:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:09:45.329726844 +0000 UTC m=+239.449359107" watchObservedRunningTime="2026-03-08 00:09:45.332193306 +0000 UTC m=+239.451825549" Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.888505 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:45 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:45 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:45 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.888589 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:45 crc kubenswrapper[4713]: I0308 00:09:45.965354 4713 ???:1] "http: TLS handshake error from 192.168.126.11:56434: no serving certificate available for 
the kubelet" Mar 08 00:09:46 crc kubenswrapper[4713]: I0308 00:09:46.119677 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lwhnh" Mar 08 00:09:46 crc kubenswrapper[4713]: I0308 00:09:46.553976 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 08 00:09:46 crc kubenswrapper[4713]: I0308 00:09:46.890553 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:46 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:46 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:46 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:46 crc kubenswrapper[4713]: I0308 00:09:46.890606 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:47 crc kubenswrapper[4713]: I0308 00:09:47.888192 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:47 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:47 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:47 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:47 crc kubenswrapper[4713]: I0308 00:09:47.888497 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" 
podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.155001 4713 ???:1] "http: TLS handshake error from 192.168.126.11:56440: no serving certificate available for the kubelet" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.874294 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.882138 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.889020 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:48 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:48 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:48 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.889060 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.901198 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969335 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access\") pod \"de40fceb-b995-45d6-8272-3a93c1b85bc8\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969390 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l55j\" (UniqueName: \"kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j\") pod \"2a04a017-1594-43d7-a796-8c676b28095e\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969412 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume\") pod \"2a04a017-1594-43d7-a796-8c676b28095e\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969464 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume\") pod \"2a04a017-1594-43d7-a796-8c676b28095e\" (UID: \"2a04a017-1594-43d7-a796-8c676b28095e\") " Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969610 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir\") pod \"de40fceb-b995-45d6-8272-3a93c1b85bc8\" (UID: \"de40fceb-b995-45d6-8272-3a93c1b85bc8\") " Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.969913 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "de40fceb-b995-45d6-8272-3a93c1b85bc8" (UID: "de40fceb-b995-45d6-8272-3a93c1b85bc8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.970420 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a04a017-1594-43d7-a796-8c676b28095e" (UID: "2a04a017-1594-43d7-a796-8c676b28095e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.975523 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a04a017-1594-43d7-a796-8c676b28095e" (UID: "2a04a017-1594-43d7-a796-8c676b28095e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.975860 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j" (OuterVolumeSpecName: "kube-api-access-5l55j") pod "2a04a017-1594-43d7-a796-8c676b28095e" (UID: "2a04a017-1594-43d7-a796-8c676b28095e"). InnerVolumeSpecName "kube-api-access-5l55j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:09:48 crc kubenswrapper[4713]: I0308 00:09:48.976059 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "de40fceb-b995-45d6-8272-3a93c1b85bc8" (UID: "de40fceb-b995-45d6-8272-3a93c1b85bc8"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.070487 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir\") pod \"64aa73b3-797b-405e-b2ca-db772f204659\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071002 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access\") pod \"64aa73b3-797b-405e-b2ca-db772f204659\" (UID: \"64aa73b3-797b-405e-b2ca-db772f204659\") " Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071393 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de40fceb-b995-45d6-8272-3a93c1b85bc8-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071421 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l55j\" (UniqueName: \"kubernetes.io/projected/2a04a017-1594-43d7-a796-8c676b28095e-kube-api-access-5l55j\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071433 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a04a017-1594-43d7-a796-8c676b28095e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071445 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a04a017-1594-43d7-a796-8c676b28095e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.071455 4713 reconciler_common.go:293] "Volume detached for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de40fceb-b995-45d6-8272-3a93c1b85bc8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.070617 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "64aa73b3-797b-405e-b2ca-db772f204659" (UID: "64aa73b3-797b-405e-b2ca-db772f204659"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.074171 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "64aa73b3-797b-405e-b2ca-db772f204659" (UID: "64aa73b3-797b-405e-b2ca-db772f204659"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.172710 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64aa73b3-797b-405e-b2ca-db772f204659-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.172749 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/64aa73b3-797b-405e-b2ca-db772f204659-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.337001 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.337022 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548800-cclv4" event={"ID":"2a04a017-1594-43d7-a796-8c676b28095e","Type":"ContainerDied","Data":"f170f29d26ed2ed2fc88befac7041785958542192c67ab73459f56dea209da08"} Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.337352 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f170f29d26ed2ed2fc88befac7041785958542192c67ab73459f56dea209da08" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.341760 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.341763 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"de40fceb-b995-45d6-8272-3a93c1b85bc8","Type":"ContainerDied","Data":"43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4"} Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.342014 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d63f0f20049184538f35ad824609d60bca169ae23561e9bd2dd8c3f0364cf4" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.343479 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"64aa73b3-797b-405e-b2ca-db772f204659","Type":"ContainerDied","Data":"55ed937bc6c9076c3c9e0296b5b1c3572c62f9313c2870371202bf79e0d60ff8"} Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.343506 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.343509 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55ed937bc6c9076c3c9e0296b5b1c3572c62f9313c2870371202bf79e0d60ff8" Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.888782 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:49 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:49 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:49 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:49 crc kubenswrapper[4713]: I0308 00:09:49.888871 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.300977 4713 patch_prober.go:28] interesting pod/console-f9d7485db-gk97q container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.301396 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-gk97q" podUID="1d068555-56f2-4bcf-8b4c-cc574ad087fa" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.341356 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.341408 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.341431 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.341470 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.889478 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:50 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:50 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:50 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:50 crc kubenswrapper[4713]: I0308 00:09:50.889544 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:51 crc kubenswrapper[4713]: I0308 00:09:51.105077 4713 ???:1] "http: TLS handshake error from 192.168.126.11:56454: no serving certificate available for the kubelet" Mar 08 00:09:51 crc kubenswrapper[4713]: I0308 00:09:51.889064 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:51 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:51 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:51 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:51 crc kubenswrapper[4713]: I0308 00:09:51.889359 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:52 crc kubenswrapper[4713]: I0308 00:09:52.888650 4713 patch_prober.go:28] interesting pod/router-default-5444994796-drs4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 00:09:52 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 08 00:09:52 crc kubenswrapper[4713]: [+]process-running ok Mar 08 00:09:52 crc kubenswrapper[4713]: healthz check failed Mar 08 00:09:52 crc kubenswrapper[4713]: I0308 00:09:52.888706 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drs4q" podUID="548e19ee-14eb-4075-b9e3-69178800837c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 00:09:54 crc kubenswrapper[4713]: I0308 00:09:54.010408 4713 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:54 crc kubenswrapper[4713]: I0308 00:09:54.013773 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-drs4q" Mar 08 00:09:55 crc kubenswrapper[4713]: I0308 00:09:55.352855 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:55 crc kubenswrapper[4713]: I0308 00:09:55.354322 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 00:09:55 crc kubenswrapper[4713]: I0308 00:09:55.369404 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de296b-0485-4f21-abf9-51043545b565-metrics-certs\") pod \"network-metrics-daemon-9klvz\" (UID: \"02de296b-0485-4f21-abf9-51043545b565\") " pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:55 crc kubenswrapper[4713]: I0308 00:09:55.457748 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 08 00:09:55 crc kubenswrapper[4713]: I0308 00:09:55.466662 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9klvz" Mar 08 00:09:58 crc kubenswrapper[4713]: E0308 00:09:58.653272 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 08 00:09:58 crc kubenswrapper[4713]: E0308 00:09:58.653680 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 00:09:58 crc kubenswrapper[4713]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 08 00:09:58 crc kubenswrapper[4713]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hrkff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29548808-nd57l_openshift-infra(fdccd72c-79d7-4388-926e-0539c571dafe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 08 00:09:58 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 08 00:09:58 crc kubenswrapper[4713]: E0308 00:09:58.654881 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29548808-nd57l" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" Mar 08 00:09:59 crc kubenswrapper[4713]: I0308 00:09:59.162450 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-57pjt"] Mar 08 00:09:59 crc kubenswrapper[4713]: I0308 00:09:59.206329 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:09:59 crc kubenswrapper[4713]: E0308 00:09:59.394231 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29548808-nd57l" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125457 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548810-lnmdz"] Mar 08 00:10:00 crc kubenswrapper[4713]: E0308 00:10:00.125701 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64aa73b3-797b-405e-b2ca-db772f204659" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125717 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="64aa73b3-797b-405e-b2ca-db772f204659" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: E0308 00:10:00.125731 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de40fceb-b995-45d6-8272-3a93c1b85bc8" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125737 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="de40fceb-b995-45d6-8272-3a93c1b85bc8" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: E0308 00:10:00.125752 4713 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2a04a017-1594-43d7-a796-8c676b28095e" containerName="collect-profiles" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125758 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a04a017-1594-43d7-a796-8c676b28095e" containerName="collect-profiles" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125899 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a04a017-1594-43d7-a796-8c676b28095e" containerName="collect-profiles" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125916 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="de40fceb-b995-45d6-8272-3a93c1b85bc8" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.125927 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="64aa73b3-797b-405e-b2ca-db772f204659" containerName="pruner" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.126384 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.130723 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.136332 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548810-lnmdz"] Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.220319 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv9nh\" (UniqueName: \"kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh\") pod \"auto-csr-approver-29548810-lnmdz\" (UID: \"6470285d-4460-4c72-be17-00e880cc623d\") " pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.321379 4713 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dv9nh\" (UniqueName: \"kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh\") pod \"auto-csr-approver-29548810-lnmdz\" (UID: \"6470285d-4460-4c72-be17-00e880cc623d\") " pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.337433 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.338897 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv9nh\" (UniqueName: \"kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh\") pod \"auto-csr-approver-29548810-lnmdz\" (UID: \"6470285d-4460-4c72-be17-00e880cc623d\") " pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340811 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340847 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-gk97q" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340867 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340873 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340906 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.340921 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.341274 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"0e456590ed6aec138d6c2be36909b347ef8e66d85928a8221898c7ed939f09c4"} pod="openshift-console/downloads-7954f5f757-z4s84" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.341314 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" containerID="cri-o://0e456590ed6aec138d6c2be36909b347ef8e66d85928a8221898c7ed939f09c4" gracePeriod=2 Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.341463 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.341501 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:00 crc kubenswrapper[4713]: I0308 00:10:00.449180 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:01 crc kubenswrapper[4713]: I0308 00:10:01.397239 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:10:01 crc kubenswrapper[4713]: I0308 00:10:01.412194 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:10:02 crc kubenswrapper[4713]: I0308 00:10:02.423288 4713 generic.go:334] "Generic (PLEG): container finished" podID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerID="0e456590ed6aec138d6c2be36909b347ef8e66d85928a8221898c7ed939f09c4" exitCode=0 Mar 08 00:10:02 crc kubenswrapper[4713]: I0308 00:10:02.423335 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z4s84" event={"ID":"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e","Type":"ContainerDied","Data":"0e456590ed6aec138d6c2be36909b347ef8e66d85928a8221898c7ed939f09c4"} Mar 08 00:10:04 crc kubenswrapper[4713]: I0308 00:10:04.501323 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:10:04 crc kubenswrapper[4713]: I0308 00:10:04.501405 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:10:07 crc kubenswrapper[4713]: I0308 00:10:07.963758 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 00:10:10 crc kubenswrapper[4713]: I0308 00:10:10.342934 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:10 crc kubenswrapper[4713]: I0308 00:10:10.343288 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:10 crc kubenswrapper[4713]: I0308 00:10:10.968213 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h5mxt" Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.475723 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" event={"ID":"abef8d7b-3e23-43e9-96d4-3227bcc16048","Type":"ContainerStarted","Data":"f276e2b1a7d3ec5d946c0b825a48087cfddd233e9465ddce823aae24d96aed33"} Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.478579 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerStarted","Data":"7c30588800e0dac5ab38807a23f6184382c53099e569400f6073fb7739048d46"} Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.490024 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.604735 4713 ???:1] "http: TLS handshake error from 192.168.126.11:55082: no serving certificate available for the kubelet" Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.730588 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:10:11 crc kubenswrapper[4713]: I0308 00:10:11.771124 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"] Mar 08 00:10:14 crc kubenswrapper[4713]: W0308 00:10:14.030162 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcde95f7_8814_4319_8a48_6d186de5f51f.slice/crio-ef8b074d9efbef9bd1985cd1c77849aac1a6142c1203709657b5b6f697605e4e WatchSource:0}: Error finding container ef8b074d9efbef9bd1985cd1c77849aac1a6142c1203709657b5b6f697605e4e: Status 404 returned error can't find the container with id ef8b074d9efbef9bd1985cd1c77849aac1a6142c1203709657b5b6f697605e4e Mar 08 00:10:14 crc kubenswrapper[4713]: W0308 00:10:14.037278 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68a8aac8_a3d8_45c3_a4f2_6420f4740ac9.slice/crio-bb5ac4f2b836df6ac588ac8b2f666d14dde9ba8adb7944edc138fe1ed9464c9d WatchSource:0}: Error finding container bb5ac4f2b836df6ac588ac8b2f666d14dde9ba8adb7944edc138fe1ed9464c9d: Status 404 returned error can't find the container with id bb5ac4f2b836df6ac588ac8b2f666d14dde9ba8adb7944edc138fe1ed9464c9d Mar 08 00:10:14 crc kubenswrapper[4713]: I0308 00:10:14.497169 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" 
event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerStarted","Data":"ef8b074d9efbef9bd1985cd1c77849aac1a6142c1203709657b5b6f697605e4e"} Mar 08 00:10:14 crc kubenswrapper[4713]: I0308 00:10:14.498927 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" event={"ID":"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9","Type":"ContainerStarted","Data":"bb5ac4f2b836df6ac588ac8b2f666d14dde9ba8adb7944edc138fe1ed9464c9d"} Mar 08 00:10:14 crc kubenswrapper[4713]: I0308 00:10:14.500080 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" event={"ID":"74518133-92a1-4cb0-bcb9-85ce78bb2c1f","Type":"ContainerStarted","Data":"409ade3b4669dbf5f8873e64f32cc4c3239e1b04d6422acbe8d91847c500cbde"} Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.887666 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.888360 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.888464 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.890868 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.891481 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.930671 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:15 crc kubenswrapper[4713]: I0308 00:10:15.931007 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:16 crc kubenswrapper[4713]: I0308 00:10:16.031665 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:16 crc kubenswrapper[4713]: I0308 00:10:16.031704 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:16 crc kubenswrapper[4713]: I0308 00:10:16.032058 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:16 crc kubenswrapper[4713]: I0308 00:10:16.067718 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:16 crc kubenswrapper[4713]: I0308 00:10:16.454584 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:20 crc kubenswrapper[4713]: I0308 00:10:20.342264 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:20 crc kubenswrapper[4713]: I0308 00:10:20.342665 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.468319 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.469147 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.475714 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.498519 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.498578 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.498597 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.599955 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.600023 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.600040 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.600073 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.600117 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.617142 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access\") pod \"installer-9-crc\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:21 crc kubenswrapper[4713]: I0308 00:10:21.788273 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:10:26 crc kubenswrapper[4713]: E0308 00:10:26.507037 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 08 00:10:26 crc kubenswrapper[4713]: E0308 00:10:26.507712 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prrdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-x6gcb_openshift-marketplace(d9341928-7a63-4190-ac37-ac9ba3320e18): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:26 crc kubenswrapper[4713]: E0308 00:10:26.508967 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-x6gcb" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" Mar 08 00:10:26 crc kubenswrapper[4713]: I0308 00:10:26.614782 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548810-lnmdz"] Mar 08 00:10:26 crc kubenswrapper[4713]: I0308 00:10:26.641405 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9klvz"] Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.163577 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.163855 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8fx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4tj99_openshift-marketplace(40864d72-e137-478e-8340-8c0f107b4c60): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.165043 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4tj99" podUID="40864d72-e137-478e-8340-8c0f107b4c60" Mar 08 00:10:29 crc 
kubenswrapper[4713]: E0308 00:10:29.638998 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.639466 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7bjqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-x7pkf_openshift-marketplace(c33b42a1-bf95-490f-a907-765855ec81d1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.640676 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-x7pkf" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.759029 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.759181 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9t4bc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-pd9br_openshift-marketplace(cd4a956b-6edb-436e-bd5e-5d57899c2ea1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.760853 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pd9br" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" Mar 08 00:10:29 crc 
kubenswrapper[4713]: E0308 00:10:29.915048 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-x6gcb" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" Mar 08 00:10:29 crc kubenswrapper[4713]: E0308 00:10:29.915079 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4tj99" podUID="40864d72-e137-478e-8340-8c0f107b4c60" Mar 08 00:10:29 crc kubenswrapper[4713]: W0308 00:10:29.935097 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6470285d_4460_4c72_be17_00e880cc623d.slice/crio-4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2 WatchSource:0}: Error finding container 4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2: Status 404 returned error can't find the container with id 4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2 Mar 08 00:10:29 crc kubenswrapper[4713]: W0308 00:10:29.936996 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02de296b_0485_4f21_abf9_51043545b565.slice/crio-a7b0c5b6adeebc1845913460990aee0d46019724eaa06db5f2781d6636cb5ccf WatchSource:0}: Error finding container a7b0c5b6adeebc1845913460990aee0d46019724eaa06db5f2781d6636cb5ccf: Status 404 returned error can't find the container with id a7b0c5b6adeebc1845913460990aee0d46019724eaa06db5f2781d6636cb5ccf Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.088324 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc 
= copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.088864 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxjck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hs88q_openshift-marketplace(2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest 
list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.090098 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hs88q" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" Mar 08 00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.341336 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.341405 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.575193 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" event={"ID":"6470285d-4460-4c72-be17-00e880cc623d","Type":"ContainerStarted","Data":"4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2"} Mar 08 00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.576793 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9klvz" event={"ID":"02de296b-0485-4f21-abf9-51043545b565","Type":"ContainerStarted","Data":"a7b0c5b6adeebc1845913460990aee0d46019724eaa06db5f2781d6636cb5ccf"} Mar 08 00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.805567 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 
00:10:30 crc kubenswrapper[4713]: I0308 00:10:30.850631 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.973919 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hs88q" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.974165 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pd9br" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" Mar 08 00:10:30 crc kubenswrapper[4713]: E0308 00:10:30.974199 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-x7pkf" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" Mar 08 00:10:30 crc kubenswrapper[4713]: W0308 00:10:30.992645 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4d4ec730_3a6b_4bb3_8878_a3f458fed7a2.slice/crio-e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7 WatchSource:0}: Error finding container e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7: Status 404 returned error can't find the container with id e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7 Mar 08 00:10:30 crc kubenswrapper[4713]: W0308 00:10:30.997403 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-poddc51fa12_ec6c_48ee_8fd5_55388414d54f.slice/crio-d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df WatchSource:0}: Error finding container d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df: Status 404 returned error can't find the container with id d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.644243 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z4s84" event={"ID":"62cfca3e-2ad8-4964-bd9a-5f907f09ca1e","Type":"ContainerStarted","Data":"6c825c4961943cf83a347e73d9455b846b95d6105e56a08a5541dea0e250734c"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.644563 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.644959 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.645015 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.646276 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" event={"ID":"74518133-92a1-4cb0-bcb9-85ce78bb2c1f","Type":"ContainerStarted","Data":"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.646297 4713 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerName="route-controller-manager" containerID="cri-o://2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e" gracePeriod=30 Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.646406 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.649088 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" event={"ID":"abef8d7b-3e23-43e9-96d4-3227bcc16048","Type":"ContainerStarted","Data":"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.649174 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" podUID="abef8d7b-3e23-43e9-96d4-3227bcc16048" containerName="controller-manager" containerID="cri-o://0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338" gracePeriod=30 Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.649242 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.652994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2","Type":"ContainerStarted","Data":"e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.654622 4713 generic.go:334] "Generic (PLEG): container finished" podID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" 
containerID="99ba221bc55466be0084d80442d6dec86c90deadbc054c19ec89fd1d01900208" exitCode=0 Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.654691 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerDied","Data":"99ba221bc55466be0084d80442d6dec86c90deadbc054c19ec89fd1d01900208"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.655644 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dc51fa12-ec6c-48ee-8fd5-55388414d54f","Type":"ContainerStarted","Data":"d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.658440 4713 generic.go:334] "Generic (PLEG): container finished" podID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerID="eb31791b33621b563ffdcd2c2e41bd769a0b407d0d7cbd536956a89ac412d5bb" exitCode=0 Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.658527 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerDied","Data":"eb31791b33621b563ffdcd2c2e41bd769a0b407d0d7cbd536956a89ac412d5bb"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.661462 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" event={"ID":"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9","Type":"ContainerStarted","Data":"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff"} Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.685931 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.701716 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" podStartSLOduration=49.701696821 podStartE2EDuration="49.701696821s" podCreationTimestamp="2026-03-08 00:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:31.697966483 +0000 UTC m=+285.817598716" watchObservedRunningTime="2026-03-08 00:10:31.701696821 +0000 UTC m=+285.821329064" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.729764 4713 patch_prober.go:28] interesting pod/route-controller-manager-857fc9cd49-86dkp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:32974->10.217.0.57:8443: read: connection reset by peer" start-of-body= Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.729806 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:32974->10.217.0.57:8443: read: connection reset by peer" Mar 08 00:10:31 crc kubenswrapper[4713]: I0308 00:10:31.730043 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" podStartSLOduration=49.730034949 podStartE2EDuration="49.730034949s" podCreationTimestamp="2026-03-08 00:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:31.714299884 +0000 UTC m=+285.833932117" watchObservedRunningTime="2026-03-08 00:10:31.730034949 +0000 UTC m=+285.849667182" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.153008 4713 csr.go:261] certificate 
signing request csr-8g47m is approved, waiting to be issued Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.160490 4713 csr.go:257] certificate signing request csr-8g47m is issued Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.565265 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-857fc9cd49-86dkp_74518133-92a1-4cb0-bcb9-85ce78bb2c1f/route-controller-manager/0.log" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.565868 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.570682 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.602860 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:10:32 crc kubenswrapper[4713]: E0308 00:10:32.609526 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abef8d7b-3e23-43e9-96d4-3227bcc16048" containerName="controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.609632 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="abef8d7b-3e23-43e9-96d4-3227bcc16048" containerName="controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: E0308 00:10:32.609651 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerName="route-controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.609664 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerName="route-controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.610518 4713 
memory_manager.go:354] "RemoveStaleState removing state" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerName="route-controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.610539 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="abef8d7b-3e23-43e9-96d4-3227bcc16048" containerName="controller-manager" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.611709 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.639343 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.658688 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert\") pod \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.658779 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca\") pod \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.658812 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jg8b\" (UniqueName: \"kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b\") pod \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.658898 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config\") pod \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\" (UID: \"74518133-92a1-4cb0-bcb9-85ce78bb2c1f\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.660309 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca" (OuterVolumeSpecName: "client-ca") pod "74518133-92a1-4cb0-bcb9-85ce78bb2c1f" (UID: "74518133-92a1-4cb0-bcb9-85ce78bb2c1f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.660427 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config" (OuterVolumeSpecName: "config") pod "74518133-92a1-4cb0-bcb9-85ce78bb2c1f" (UID: "74518133-92a1-4cb0-bcb9-85ce78bb2c1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.664879 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b" (OuterVolumeSpecName: "kube-api-access-5jg8b") pod "74518133-92a1-4cb0-bcb9-85ce78bb2c1f" (UID: "74518133-92a1-4cb0-bcb9-85ce78bb2c1f"). InnerVolumeSpecName "kube-api-access-5jg8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.664971 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "74518133-92a1-4cb0-bcb9-85ce78bb2c1f" (UID: "74518133-92a1-4cb0-bcb9-85ce78bb2c1f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.666700 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dc51fa12-ec6c-48ee-8fd5-55388414d54f","Type":"ContainerStarted","Data":"b5c6644f13e27288f2154b86d0cb3a5c886ae340b696eaaa05f0b93b6be6c6d6"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.668574 4713 generic.go:334] "Generic (PLEG): container finished" podID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerID="524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133" exitCode=0 Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.668633 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerDied","Data":"524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.669873 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-857fc9cd49-86dkp_74518133-92a1-4cb0-bcb9-85ce78bb2c1f/route-controller-manager/0.log" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.669901 4713 generic.go:334] "Generic (PLEG): container finished" podID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" containerID="2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e" exitCode=255 Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.669965 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.670037 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" event={"ID":"74518133-92a1-4cb0-bcb9-85ce78bb2c1f","Type":"ContainerDied","Data":"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.670051 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp" event={"ID":"74518133-92a1-4cb0-bcb9-85ce78bb2c1f","Type":"ContainerDied","Data":"409ade3b4669dbf5f8873e64f32cc4c3239e1b04d6422acbe8d91847c500cbde"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.670065 4713 scope.go:117] "RemoveContainer" containerID="2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.673450 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9klvz" event={"ID":"02de296b-0485-4f21-abf9-51043545b565","Type":"ContainerStarted","Data":"8d66e38ca3acbd10e7fd1bbbfa3f7735eac5a6a0db2471c93d80fc8e73e19ae2"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.673512 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9klvz" event={"ID":"02de296b-0485-4f21-abf9-51043545b565","Type":"ContainerStarted","Data":"6174fac062b15063d6f4a7cb7e5e9cc9fcde6c4007b95d3fe1884f1c0485c85d"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.675871 4713 generic.go:334] "Generic (PLEG): container finished" podID="fdccd72c-79d7-4388-926e-0539c571dafe" containerID="11992517ed2080bab72a9aa961669962e2daffa5f367346a3dc9ef9010cbb913" exitCode=0 Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.676000 4713 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29548808-nd57l" event={"ID":"fdccd72c-79d7-4388-926e-0539c571dafe","Type":"ContainerDied","Data":"11992517ed2080bab72a9aa961669962e2daffa5f367346a3dc9ef9010cbb913"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.677800 4713 generic.go:334] "Generic (PLEG): container finished" podID="abef8d7b-3e23-43e9-96d4-3227bcc16048" containerID="0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338" exitCode=0 Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.677886 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.677892 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" event={"ID":"abef8d7b-3e23-43e9-96d4-3227bcc16048","Type":"ContainerDied","Data":"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.679637 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8" event={"ID":"abef8d7b-3e23-43e9-96d4-3227bcc16048","Type":"ContainerDied","Data":"f276e2b1a7d3ec5d946c0b825a48087cfddd233e9465ddce823aae24d96aed33"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.685797 4713 generic.go:334] "Generic (PLEG): container finished" podID="4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" containerID="3d57ce672ca7a4417b25b823232a1b0087d96c80347a2c4c027d8db9eed30aa7" exitCode=0 Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.686311 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2","Type":"ContainerDied","Data":"3d57ce672ca7a4417b25b823232a1b0087d96c80347a2c4c027d8db9eed30aa7"} Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.686763 
4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.686778 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.686807 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.688595 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=11.688581918 podStartE2EDuration="11.688581918s" podCreationTimestamp="2026-03-08 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:32.685050665 +0000 UTC m=+286.804682898" watchObservedRunningTime="2026-03-08 00:10:32.688581918 +0000 UTC m=+286.808214161" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.705913 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.708190 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857fc9cd49-86dkp"] Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.754026 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" podStartSLOduration=214.754008405 podStartE2EDuration="3m34.754008405s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:32.750557014 +0000 UTC m=+286.870189277" watchObservedRunningTime="2026-03-08 00:10:32.754008405 +0000 UTC m=+286.873640638" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760023 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config\") pod \"abef8d7b-3e23-43e9-96d4-3227bcc16048\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760087 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5w5j\" (UniqueName: \"kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j\") pod \"abef8d7b-3e23-43e9-96d4-3227bcc16048\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760142 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca\") pod \"abef8d7b-3e23-43e9-96d4-3227bcc16048\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760220 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert\") pod \"abef8d7b-3e23-43e9-96d4-3227bcc16048\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760279 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles\") pod \"abef8d7b-3e23-43e9-96d4-3227bcc16048\" (UID: \"abef8d7b-3e23-43e9-96d4-3227bcc16048\") " Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760505 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjt7x\" (UniqueName: \"kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.760560 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761210 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761272 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " 
pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761386 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761620 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761650 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jg8b\" (UniqueName: \"kubernetes.io/projected/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-kube-api-access-5jg8b\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.761664 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74518133-92a1-4cb0-bcb9-85ce78bb2c1f-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.762087 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "abef8d7b-3e23-43e9-96d4-3227bcc16048" (UID: "abef8d7b-3e23-43e9-96d4-3227bcc16048"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.762216 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca" (OuterVolumeSpecName: "client-ca") pod "abef8d7b-3e23-43e9-96d4-3227bcc16048" (UID: "abef8d7b-3e23-43e9-96d4-3227bcc16048"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.762619 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config" (OuterVolumeSpecName: "config") pod "abef8d7b-3e23-43e9-96d4-3227bcc16048" (UID: "abef8d7b-3e23-43e9-96d4-3227bcc16048"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.763030 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j" (OuterVolumeSpecName: "kube-api-access-g5w5j") pod "abef8d7b-3e23-43e9-96d4-3227bcc16048" (UID: "abef8d7b-3e23-43e9-96d4-3227bcc16048"). InnerVolumeSpecName "kube-api-access-g5w5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.763061 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "abef8d7b-3e23-43e9-96d4-3227bcc16048" (UID: "abef8d7b-3e23-43e9-96d4-3227bcc16048"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.862585 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjt7x\" (UniqueName: \"kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.862880 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.862949 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.862972 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.863021 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5w5j\" (UniqueName: 
\"kubernetes.io/projected/abef8d7b-3e23-43e9-96d4-3227bcc16048-kube-api-access-g5w5j\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.863031 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.863039 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abef8d7b-3e23-43e9-96d4-3227bcc16048-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.863048 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.863056 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abef8d7b-3e23-43e9-96d4-3227bcc16048-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.864069 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.864372 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " 
pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.866240 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.884562 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjt7x\" (UniqueName: \"kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x\") pod \"route-controller-manager-6b94cbf9d6-j2rxl\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.886793 4713 scope.go:117] "RemoveContainer" containerID="2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e" Mar 08 00:10:32 crc kubenswrapper[4713]: E0308 00:10:32.889088 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e\": container with ID starting with 2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e not found: ID does not exist" containerID="2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.889135 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e"} err="failed to get container status \"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e\": rpc error: code = NotFound desc = could not find container 
\"2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e\": container with ID starting with 2be52eb7ab64193d806b01369127f24c2cc2e879c0591f39fd96e17a48caa66e not found: ID does not exist" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.889167 4713 scope.go:117] "RemoveContainer" containerID="0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338" Mar 08 00:10:32 crc kubenswrapper[4713]: I0308 00:10:32.933964 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.025288 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.025344 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c6f4b84f7-f59s8"] Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.081677 4713 scope.go:117] "RemoveContainer" containerID="0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338" Mar 08 00:10:33 crc kubenswrapper[4713]: E0308 00:10:33.082225 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338\": container with ID starting with 0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338 not found: ID does not exist" containerID="0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338" Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.082265 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338"} err="failed to get container status \"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338\": rpc error: code = NotFound desc = could 
not find container \"0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338\": container with ID starting with 0d91eba4ab098027c3901709dbaf3407c532160b9470b4400dd143735aa2d338 not found: ID does not exist" Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.162993 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-02 13:16:59.715685512 +0000 UTC Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.163035 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6469h6m26.552653422s for next certificate rotation Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.919145 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9klvz" podStartSLOduration=215.919122045 podStartE2EDuration="3m35.919122045s" podCreationTimestamp="2026-03-08 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:33.712159873 +0000 UTC m=+287.831792126" watchObservedRunningTime="2026-03-08 00:10:33.919122045 +0000 UTC m=+288.038754278" Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.923492 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:10:33 crc kubenswrapper[4713]: W0308 00:10:33.942082 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7daca87e_5103_46bd_b6ae_7643c66a4fbc.slice/crio-ca8e90ef695a32802124e9aceef3123bdb89dbe43217f030e702dfd71adfbdc7 WatchSource:0}: Error finding container ca8e90ef695a32802124e9aceef3123bdb89dbe43217f030e702dfd71adfbdc7: Status 404 returned error can't find the container with id ca8e90ef695a32802124e9aceef3123bdb89dbe43217f030e702dfd71adfbdc7 Mar 08 00:10:33 crc 
kubenswrapper[4713]: I0308 00:10:33.945889 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:33 crc kubenswrapper[4713]: I0308 00:10:33.973986 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548808-nd57l" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.111013 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrkff\" (UniqueName: \"kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff\") pod \"fdccd72c-79d7-4388-926e-0539c571dafe\" (UID: \"fdccd72c-79d7-4388-926e-0539c571dafe\") " Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.111129 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access\") pod \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.111200 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir\") pod \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\" (UID: \"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2\") " Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.111323 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" (UID: "4d4ec730-3a6b-4bb3-8878-a3f458fed7a2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.111542 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.116368 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff" (OuterVolumeSpecName: "kube-api-access-hrkff") pod "fdccd72c-79d7-4388-926e-0539c571dafe" (UID: "fdccd72c-79d7-4388-926e-0539c571dafe"). InnerVolumeSpecName "kube-api-access-hrkff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.116417 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" (UID: "4d4ec730-3a6b-4bb3-8878-a3f458fed7a2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.212850 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrkff\" (UniqueName: \"kubernetes.io/projected/fdccd72c-79d7-4388-926e-0539c571dafe-kube-api-access-hrkff\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.213182 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d4ec730-3a6b-4bb3-8878-a3f458fed7a2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.502303 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.502359 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.502399 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.502881 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.502940 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd" gracePeriod=600 Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.570700 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74518133-92a1-4cb0-bcb9-85ce78bb2c1f" path="/var/lib/kubelet/pods/74518133-92a1-4cb0-bcb9-85ce78bb2c1f/volumes" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.571595 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abef8d7b-3e23-43e9-96d4-3227bcc16048" path="/var/lib/kubelet/pods/abef8d7b-3e23-43e9-96d4-3227bcc16048/volumes" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.702607 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4d4ec730-3a6b-4bb3-8878-a3f458fed7a2","Type":"ContainerDied","Data":"e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.702646 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6e2e6a429ef142ccdb208a757e7c7f167926e39716a378a021d8f8203cc62e7" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.702722 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.706763 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd" exitCode=0 Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.706865 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.710089 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerStarted","Data":"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.712258 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548808-nd57l" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.712674 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548808-nd57l" event={"ID":"fdccd72c-79d7-4388-926e-0539c571dafe","Type":"ContainerDied","Data":"0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.712718 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0af707d82a061d622eec317592ad4179a6046c0ac5a6b6a6071ecbfdd53ddeaa" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.715026 4713 generic.go:334] "Generic (PLEG): container finished" podID="6470285d-4460-4c72-be17-00e880cc623d" containerID="1cac5b889750a3972edc99367bdaaf3ef41e15813fd86b31ba34d9a937e3a2a1" exitCode=0 Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.715083 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" event={"ID":"6470285d-4460-4c72-be17-00e880cc623d","Type":"ContainerDied","Data":"1cac5b889750a3972edc99367bdaaf3ef41e15813fd86b31ba34d9a937e3a2a1"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.718230 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" event={"ID":"7daca87e-5103-46bd-b6ae-7643c66a4fbc","Type":"ContainerStarted","Data":"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.718284 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" event={"ID":"7daca87e-5103-46bd-b6ae-7643c66a4fbc","Type":"ContainerStarted","Data":"ca8e90ef695a32802124e9aceef3123bdb89dbe43217f030e702dfd71adfbdc7"} Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.718495 4713 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.731697 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.739309 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5hssk" podStartSLOduration=7.277833377 podStartE2EDuration="52.739289061s" podCreationTimestamp="2026-03-08 00:09:42 +0000 UTC" firstStartedPulling="2026-03-08 00:09:48.810052202 +0000 UTC m=+242.929684435" lastFinishedPulling="2026-03-08 00:10:34.271507886 +0000 UTC m=+288.391140119" observedRunningTime="2026-03-08 00:10:34.737472543 +0000 UTC m=+288.857104786" watchObservedRunningTime="2026-03-08 00:10:34.739289061 +0000 UTC m=+288.858921294" Mar 08 00:10:34 crc kubenswrapper[4713]: I0308 00:10:34.771612 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" podStartSLOduration=13.771593124 podStartE2EDuration="13.771593124s" podCreationTimestamp="2026-03-08 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:34.770721431 +0000 UTC m=+288.890353674" watchObservedRunningTime="2026-03-08 00:10:34.771593124 +0000 UTC m=+288.891225357" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.352331 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:10:35 crc kubenswrapper[4713]: E0308 00:10:35.352901 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" containerName="pruner" Mar 08 00:10:35 crc 
kubenswrapper[4713]: I0308 00:10:35.352915 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" containerName="pruner" Mar 08 00:10:35 crc kubenswrapper[4713]: E0308 00:10:35.352928 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" containerName="oc" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.352938 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" containerName="oc" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.353054 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4ec730-3a6b-4bb3-8878-a3f458fed7a2" containerName="pruner" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.353068 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" containerName="oc" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.353477 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.355205 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.355532 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.356620 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.361173 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.361366 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.363539 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.364535 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.366766 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.435736 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " 
pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.435911 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48rlp\" (UniqueName: \"kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.435967 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.435993 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.436011 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.536933 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48rlp\" (UniqueName: 
\"kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.536992 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.537015 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.537033 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.537058 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.538518 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.538527 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.539040 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.547115 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.552758 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48rlp\" (UniqueName: \"kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp\") pod \"controller-manager-b59c8fc9c-nklnq\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 
00:10:35.678510 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.911384 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.923430 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:35 crc kubenswrapper[4713]: W0308 00:10:35.925528 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58583d53_0add_4758_8d8b_c309a79b4c48.slice/crio-bf14b4768a06207e44a9e2b8f817f874dac0b317715a2c1cef7640a7a7b1ee98 WatchSource:0}: Error finding container bf14b4768a06207e44a9e2b8f817f874dac0b317715a2c1cef7640a7a7b1ee98: Status 404 returned error can't find the container with id bf14b4768a06207e44a9e2b8f817f874dac0b317715a2c1cef7640a7a7b1ee98 Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.942392 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv9nh\" (UniqueName: \"kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh\") pod \"6470285d-4460-4c72-be17-00e880cc623d\" (UID: \"6470285d-4460-4c72-be17-00e880cc623d\") " Mar 08 00:10:35 crc kubenswrapper[4713]: I0308 00:10:35.950606 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh" (OuterVolumeSpecName: "kube-api-access-dv9nh") pod "6470285d-4460-4c72-be17-00e880cc623d" (UID: "6470285d-4460-4c72-be17-00e880cc623d"). InnerVolumeSpecName "kube-api-access-dv9nh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.043978 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv9nh\" (UniqueName: \"kubernetes.io/projected/6470285d-4460-4c72-be17-00e880cc623d-kube-api-access-dv9nh\") on node \"crc\" DevicePath \"\"" Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.731365 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" event={"ID":"6470285d-4460-4c72-be17-00e880cc623d","Type":"ContainerDied","Data":"4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2"} Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.731401 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548810-lnmdz" Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.731415 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c7523e0406dedf70f87c204d810a583910f394e92876f1ad63424e8210147d2" Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.736995 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224"} Mar 08 00:10:36 crc kubenswrapper[4713]: I0308 00:10:36.739122 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" event={"ID":"58583d53-0add-4758-8d8b-c309a79b4c48","Type":"ContainerStarted","Data":"bf14b4768a06207e44a9e2b8f817f874dac0b317715a2c1cef7640a7a7b1ee98"} Mar 08 00:10:37 crc kubenswrapper[4713]: I0308 00:10:37.745435 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" 
event={"ID":"58583d53-0add-4758-8d8b-c309a79b4c48","Type":"ContainerStarted","Data":"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1"} Mar 08 00:10:37 crc kubenswrapper[4713]: I0308 00:10:37.761780 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" podStartSLOduration=16.761751213 podStartE2EDuration="16.761751213s" podCreationTimestamp="2026-03-08 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:10:37.75902055 +0000 UTC m=+291.878652783" watchObservedRunningTime="2026-03-08 00:10:37.761751213 +0000 UTC m=+291.881383486" Mar 08 00:10:38 crc kubenswrapper[4713]: I0308 00:10:38.750992 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:38 crc kubenswrapper[4713]: I0308 00:10:38.756237 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:10:40 crc kubenswrapper[4713]: I0308 00:10:40.342817 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:40 crc kubenswrapper[4713]: I0308 00:10:40.343132 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:40 crc kubenswrapper[4713]: I0308 00:10:40.343369 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-z4s84 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 08 00:10:40 crc kubenswrapper[4713]: I0308 00:10:40.343425 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z4s84" podUID="62cfca3e-2ad8-4964-bd9a-5f907f09ca1e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 08 00:10:42 crc kubenswrapper[4713]: I0308 00:10:42.817436 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:10:42 crc kubenswrapper[4713]: I0308 00:10:42.817987 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:10:43 crc kubenswrapper[4713]: I0308 00:10:43.433632 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:10:43 crc kubenswrapper[4713]: I0308 00:10:43.824209 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:10:47 crc kubenswrapper[4713]: E0308 00:10:47.066281 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 08 00:10:47 crc kubenswrapper[4713]: E0308 00:10:47.066991 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmk7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rdgpc_openshift-marketplace(dcde95f7-8814-4319-8a48-6d186de5f51f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:47 crc kubenswrapper[4713]: E0308 00:10:47.068145 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rdgpc" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" Mar 08 00:10:47 crc 
kubenswrapper[4713]: E0308 00:10:47.808110 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rdgpc" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" Mar 08 00:10:48 crc kubenswrapper[4713]: E0308 00:10:48.236936 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 08 00:10:48 crc kubenswrapper[4713]: E0308 00:10:48.237645 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kfdss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-57pjt_openshift-marketplace(e23a30a2-2bf8-451e-b85b-b293e8949e9e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:10:48 crc kubenswrapper[4713]: E0308 00:10:48.239890 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-57pjt" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" Mar 08 00:10:48 crc 
kubenswrapper[4713]: E0308 00:10:48.813564 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-57pjt" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.358749 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-z4s84" Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.823281 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerStarted","Data":"f5743c83cf849ed0707f05f9170f67beed9226bd36833eb3fea5238d2ff525b8"} Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.826637 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerStarted","Data":"46ee2fecb258f3bbeadd642b9e3423768d2062de8a5dd3a187b3ace78fd14497"} Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.832753 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerStarted","Data":"c0124cd1b5219c688a51426a00c55773b87427b1a16957ad745e3fd3a1ca06b1"} Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.835404 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerStarted","Data":"208d6f7268d01f9f7e50afe48b84246d8fc86cf25d817c7b3ce1701103741603"} Mar 08 00:10:50 crc kubenswrapper[4713]: I0308 00:10:50.837924 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerStarted","Data":"c2bf098434bfcc867c8195b8c42297c739230b688ab856c67dbf7a34e9987066"} Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.848207 4713 generic.go:334] "Generic (PLEG): container finished" podID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerID="f5743c83cf849ed0707f05f9170f67beed9226bd36833eb3fea5238d2ff525b8" exitCode=0 Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.848426 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerDied","Data":"f5743c83cf849ed0707f05f9170f67beed9226bd36833eb3fea5238d2ff525b8"} Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.851256 4713 generic.go:334] "Generic (PLEG): container finished" podID="40864d72-e137-478e-8340-8c0f107b4c60" containerID="46ee2fecb258f3bbeadd642b9e3423768d2062de8a5dd3a187b3ace78fd14497" exitCode=0 Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.851325 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerDied","Data":"46ee2fecb258f3bbeadd642b9e3423768d2062de8a5dd3a187b3ace78fd14497"} Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.854617 4713 generic.go:334] "Generic (PLEG): container finished" podID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerID="c0124cd1b5219c688a51426a00c55773b87427b1a16957ad745e3fd3a1ca06b1" exitCode=0 Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.854701 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerDied","Data":"c0124cd1b5219c688a51426a00c55773b87427b1a16957ad745e3fd3a1ca06b1"} Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.858915 
4713 generic.go:334] "Generic (PLEG): container finished" podID="c33b42a1-bf95-490f-a907-765855ec81d1" containerID="208d6f7268d01f9f7e50afe48b84246d8fc86cf25d817c7b3ce1701103741603" exitCode=0 Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.859010 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerDied","Data":"208d6f7268d01f9f7e50afe48b84246d8fc86cf25d817c7b3ce1701103741603"} Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.866114 4713 generic.go:334] "Generic (PLEG): container finished" podID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerID="c2bf098434bfcc867c8195b8c42297c739230b688ab856c67dbf7a34e9987066" exitCode=0 Mar 08 00:10:51 crc kubenswrapper[4713]: I0308 00:10:51.866161 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerDied","Data":"c2bf098434bfcc867c8195b8c42297c739230b688ab856c67dbf7a34e9987066"} Mar 08 00:10:55 crc kubenswrapper[4713]: I0308 00:10:55.073986 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:11:01 crc kubenswrapper[4713]: I0308 00:11:01.926432 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerStarted","Data":"99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6"} Mar 08 00:11:02 crc kubenswrapper[4713]: I0308 00:11:02.958713 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x6gcb" podStartSLOduration=5.154366956 podStartE2EDuration="1m22.958695066s" podCreationTimestamp="2026-03-08 00:09:40 +0000 UTC" firstStartedPulling="2026-03-08 00:09:42.18781589 +0000 UTC 
m=+236.307448123" lastFinishedPulling="2026-03-08 00:10:59.992144 +0000 UTC m=+314.111776233" observedRunningTime="2026-03-08 00:11:02.955770679 +0000 UTC m=+317.075402962" watchObservedRunningTime="2026-03-08 00:11:02.958695066 +0000 UTC m=+317.078327299" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.491861 4713 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.492655 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6470285d-4460-4c72-be17-00e880cc623d" containerName="oc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.492680 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6470285d-4460-4c72-be17-00e880cc623d" containerName="oc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.492884 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6470285d-4460-4c72-be17-00e880cc623d" containerName="oc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493336 4713 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493470 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493652 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd" gracePeriod=15 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493723 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7" gracePeriod=15 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493814 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7" gracePeriod=15 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493797 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637" gracePeriod=15 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.493783 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e" gracePeriod=15 Mar 08 00:11:09 crc 
kubenswrapper[4713]: I0308 00:11:09.494489 4713 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494665 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494684 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494695 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494704 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494714 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494723 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494734 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494742 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494752 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494761 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494778 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494790 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494805 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494816 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.494864 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.494874 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495009 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495019 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495066 4713 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495079 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495093 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495104 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495117 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495131 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.495294 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495304 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: E0308 00:11:09.495316 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495325 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.495467 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.536575 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551418 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551467 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551517 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551541 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551606 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551623 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551667 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.551693 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.652937 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653155 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653266 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653306 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653402 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653380 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653574 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653616 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653656 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653672 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 
crc kubenswrapper[4713]: I0308 00:11:09.653666 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653698 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653748 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653683 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.653808 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.831993 4713 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.975540 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.976899 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.977653 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7" exitCode=0 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.977721 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e" exitCode=0 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.977733 4713 scope.go:117] "RemoveContainer" containerID="5c96bb1af73724115a1b1e98538ddfe6570b62de532cfe90729db839502a1707" Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.977740 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637" exitCode=0 Mar 08 00:11:09 crc kubenswrapper[4713]: I0308 00:11:09.977750 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7" exitCode=2 Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.083719 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:10Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:10Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:10Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:10Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\
\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.084690 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.085258 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.085671 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.085971 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: E0308 00:11:10.086001 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 
00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.845367 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.845603 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.886756 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.887311 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.887803 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.986205 4713 generic.go:334] "Generic (PLEG): container finished" podID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" containerID="b5c6644f13e27288f2154b86d0cb3a5c886ae340b696eaaa05f0b93b6be6c6d6" exitCode=0 Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.986323 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dc51fa12-ec6c-48ee-8fd5-55388414d54f","Type":"ContainerDied","Data":"b5c6644f13e27288f2154b86d0cb3a5c886ae340b696eaaa05f0b93b6be6c6d6"} Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 
00:11:10.987072 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.987558 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:10 crc kubenswrapper[4713]: I0308 00:11:10.987914 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.026428 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.026973 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.027364 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.027710 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:11 crc kubenswrapper[4713]: E0308 00:11:11.106243 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-hs88q.189ab53ba5568682 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-hs88q,UID:2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0,APIVersion:v1,ResourceVersion:28774,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 19.254s (19.254s including waiting). 
Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:11:11.105320578 +0000 UTC m=+325.224952831,LastTimestamp:2026-03-08 00:11:11.105320578 +0000 UTC m=+325.224952831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.996721 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 00:11:11 crc kubenswrapper[4713]: I0308 00:11:11.998052 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd" exitCode=0 Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.500277 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.500894 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.501257 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.501605 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: W0308 00:11:12.550821 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-dcf7e359bd80d171b4b13b74a08f0371efc2c48ba7b96293cc536863b0f1e088 WatchSource:0}: Error finding container dcf7e359bd80d171b4b13b74a08f0371efc2c48ba7b96293cc536863b0f1e088: Status 404 returned error can't find the container with id dcf7e359bd80d171b4b13b74a08f0371efc2c48ba7b96293cc536863b0f1e088 Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.578978 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.580132 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.580424 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.580572 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.580702 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.580940 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.603160 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dc51fa12-ec6c-48ee-8fd5-55388414d54f" (UID: 
"dc51fa12-ec6c-48ee-8fd5-55388414d54f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.603262 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir\") pod \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.603381 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access\") pod \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.603449 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock\") pod \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\" (UID: \"dc51fa12-ec6c-48ee-8fd5-55388414d54f\") " Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.603901 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock" (OuterVolumeSpecName: "var-lock") pod "dc51fa12-ec6c-48ee-8fd5-55388414d54f" (UID: "dc51fa12-ec6c-48ee-8fd5-55388414d54f"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.605960 4713 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-var-lock\") on node \"crc\" DevicePath \"\"" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.605991 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.609503 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dc51fa12-ec6c-48ee-8fd5-55388414d54f" (UID: "dc51fa12-ec6c-48ee-8fd5-55388414d54f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.706985 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.707186 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.707332 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.707363 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.707615 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.708021 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.708303 4713 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.708752 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc51fa12-ec6c-48ee-8fd5-55388414d54f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.708773 4713 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:11:12 crc kubenswrapper[4713]: I0308 00:11:12.708782 4713 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.012549 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerStarted","Data":"e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6"} Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.014000 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.014246 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.014557 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.020852 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerStarted","Data":"54d94291bba3da410042a68b46eeee3f18e230b96de2843a430f6d4aa0771496"} Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.022596 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.023363 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.023854 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.024148 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.024345 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.024486 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.024698 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.024852 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.027800 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dc51fa12-ec6c-48ee-8fd5-55388414d54f","Type":"ContainerDied","Data":"d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df"} Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.027851 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9172293da02dd75281be2c0f6a68b321d4fe6ee21fc35d92d3715acf36901df" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.027948 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.033008 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293"} Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.033197 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"dcf7e359bd80d171b4b13b74a08f0371efc2c48ba7b96293cc536863b0f1e088"} Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.033625 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 
00:11:13.034186 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.037924 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.038297 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.038677 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.039121 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.040329 
4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerStarted","Data":"a032630e16097c96141079adebfc1092e90366030a54b1b60ed4f6c7681a4c79"} Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.042025 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.042500 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.042746 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.043041 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.043499 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.045081 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.046397 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.046588 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerStarted","Data":"71df55d2c41e29b364984f11829b378396c7e97525399c55ef7102e7db5b6a0a"} Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.047334 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.047773 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" 
pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.048138 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.048373 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.048744 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.052150 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerStarted","Data":"811a7fecc13f433a775d8c8b046af8802008222a2688bfa3140a6cccdba2f8bb"} Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.052228 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.053434 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.055440 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.056928 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.057243 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.057376 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.057298 4713 scope.go:117] "RemoveContainer" containerID="9f4ada86c457e1168fa15663057fa20ffd0ed16f2f5ba9ac2c5a32e3742de2a7" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.057724 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.058002 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.058223 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.058474 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.058695 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.058952 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.059177 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.059390 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.075443 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerStarted","Data":"023ca4eb6026d184356661b957d297149cfe69e644ecd5ceb7a20eb3c76a9016"} Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.075527 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.078933 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.079215 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.079444 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.079668 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.079927 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.080287 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.080503 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.080722 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.081159 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.084987 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.085189 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.085555 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.088152 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.088281 4713 scope.go:117] "RemoveContainer" containerID="d4d1520c60ff738c9ba2994b7bdda69ba12473e243a6db42d19d385c8169834e" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.088683 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.088899 4713 
status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.089058 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.089214 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.089415 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.109224 4713 scope.go:117] "RemoveContainer" containerID="ea9a282cc5b0190d398425d97e0d7785380a8ad776e862d47eb627897e069637" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.132480 4713 scope.go:117] "RemoveContainer" containerID="3c81e926fb66874354e2f1315196a247f3a9600ea13a2ae363225f964cc563d7" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.146799 4713 scope.go:117] "RemoveContainer" 
containerID="830a3288c8cee2baf75634cbf8b29b5a1e93fd85f2f9015935860cfdb29c7bcd" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.164812 4713 scope.go:117] "RemoveContainer" containerID="982004a53f1ffe4be435bd18b7277e42155502af709b8976e148caa6b4211510" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.260532 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:11:13 crc kubenswrapper[4713]: I0308 00:11:13.260596 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.083270 4713 generic.go:334] "Generic (PLEG): container finished" podID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerID="71df55d2c41e29b364984f11829b378396c7e97525399c55ef7102e7db5b6a0a" exitCode=0 Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.083320 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerDied","Data":"71df55d2c41e29b364984f11829b378396c7e97525399c55ef7102e7db5b6a0a"} Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.084416 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.091181 4713 generic.go:334] "Generic (PLEG): container finished" podID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerID="811a7fecc13f433a775d8c8b046af8802008222a2688bfa3140a6cccdba2f8bb" exitCode=0 Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.091215 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerDied","Data":"811a7fecc13f433a775d8c8b046af8802008222a2688bfa3140a6cccdba2f8bb"} Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.091179 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.091474 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.091882 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.093099 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.093512 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.093868 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.094096 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.094379 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.094737 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.095063 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" 
pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.095297 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.095558 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.095755 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.096008 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.096268 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" 
pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.097565 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.098058 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.099379 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.099698 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.303579 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-hs88q" 
podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="registry-server" probeResult="failure" output=< Mar 08 00:11:14 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 08 00:11:14 crc kubenswrapper[4713]: > Mar 08 00:11:14 crc kubenswrapper[4713]: I0308 00:11:14.547058 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.097804 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerStarted","Data":"bd4a8e19339f53886f8e1f05d3792cb1bb29da3b9e4c6bc029a48012b0bfe269"} Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.098780 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.099238 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.099561 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection 
refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.099811 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.100078 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.100383 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.100622 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.100797 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerStarted","Data":"4ed848ed6abb07f4a89c3ace3ce761bce0134ceff6e51ed39e7ca6d27a1477c1"} Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.100911 4713 
status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.101181 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.101550 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.101930 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.102347 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.102579 4713 
status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.102868 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.103183 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.103447 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.103668 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:15 crc kubenswrapper[4713]: I0308 00:11:15.103915 4713 
status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.543222 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.544350 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.544758 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.545101 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.545462 4713 
status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.545797 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.546007 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.546169 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:16 crc kubenswrapper[4713]: I0308 00:11:16.546415 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:17 crc kubenswrapper[4713]: E0308 00:11:17.352308 4713 event.go:368] 
"Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-hs88q.189ab53ba5568682 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-hs88q,UID:2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0,APIVersion:v1,ResourceVersion:28774,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 19.254s (19.254s including waiting). Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 00:11:11.105320578 +0000 UTC m=+325.224952831,LastTimestamp:2026-03-08 00:11:11.105320578 +0000 UTC m=+325.224952831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.245338 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.245855 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.246361 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.188:6443: connect: connection refused" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.246646 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.246949 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:18 crc kubenswrapper[4713]: I0308 00:11:18.246987 4713 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.247209 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="200ms" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.448067 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="400ms" Mar 08 00:11:18 crc kubenswrapper[4713]: E0308 00:11:18.849543 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="800ms" Mar 08 00:11:19 crc kubenswrapper[4713]: E0308 00:11:19.650237 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="1.6s" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.375662 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:20Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:20Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:20Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T00:11:20Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"
],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":9078
37715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276
c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"nam
es\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.376216 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.376497 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.376846 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.377104 4713 kubelet_node_status.go:585] "Error updating node status, will 
retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:20 crc kubenswrapper[4713]: E0308 00:11:20.377129 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.042339 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.042405 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.080909 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.081463 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.081979 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.082219 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.082458 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.082788 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.083054 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.083353 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.083622 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.083879 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.176502 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.176805 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.177058 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.177293 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 
crc kubenswrapper[4713]: I0308 00:11:21.177516 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.177811 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.178063 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.178266 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.178499 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 
00:11:21.178725 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.239514 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.239621 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:11:21 crc kubenswrapper[4713]: E0308 00:11:21.251850 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="3.2s" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.278519 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.278966 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.279287 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial 
tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.279488 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.279692 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.279899 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.280059 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.280279 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 
38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.280561 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.280817 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.485052 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.485145 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.528397 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.529075 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.529446 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" 
pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.529979 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.530667 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.530990 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.531308 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.531890 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.532169 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:21 crc kubenswrapper[4713]: I0308 00:11:21.532500 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.172205 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.172665 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.172856 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 
38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.173078 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.173288 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.173617 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.174069 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.174523 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.175101 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.175362 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.192573 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.193403 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.193895 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.194255 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" 
pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.194500 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.194735 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.194967 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.195182 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.195398 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" 
pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:22 crc kubenswrapper[4713]: I0308 00:11:22.195618 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.142783 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.143959 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.144003 4713 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414" exitCode=1 Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.144113 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414"} Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.145850 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.146315 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.146756 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.146982 4713 scope.go:117] "RemoveContainer" containerID="b889b5cdcdafac4c08a37ddbf65fe6148e451c41914c8963bf50be9c84e84414" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.147006 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.147295 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc 
kubenswrapper[4713]: I0308 00:11:23.147660 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.147944 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.148166 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.148369 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.148543 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc 
kubenswrapper[4713]: I0308 00:11:23.313052 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.313644 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.313997 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.314219 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.314413 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.314676 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.315025 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.315426 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.315707 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.316024 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.316233 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.349337 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.349893 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.350307 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.350492 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.350642 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc 
kubenswrapper[4713]: I0308 00:11:23.350776 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.350938 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.351072 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.351201 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.351330 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc 
kubenswrapper[4713]: I0308 00:11:23.351460 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.540642 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.541538 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.541886 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.542499 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.542730 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.545600 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.546031 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.546248 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.546410 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.546565 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" 
pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.546719 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.553457 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.553481 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:23 crc kubenswrapper[4713]: E0308 00:11:23.553903 4713 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.554605 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:23 crc kubenswrapper[4713]: W0308 00:11:23.572521 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-b26ba76973d9ec22bccc49af00997f36bc34ee4f2fda7a368a5405af52001fac WatchSource:0}: Error finding container b26ba76973d9ec22bccc49af00997f36bc34ee4f2fda7a368a5405af52001fac: Status 404 returned error can't find the container with id b26ba76973d9ec22bccc49af00997f36bc34ee4f2fda7a368a5405af52001fac Mar 08 00:11:23 crc kubenswrapper[4713]: I0308 00:11:23.692463 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.152067 4713 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ed090832d4b722ebd3fbecf4ff160ef991490ffe56d3217e0d5ae483ae265d9a" exitCode=0 Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.152157 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ed090832d4b722ebd3fbecf4ff160ef991490ffe56d3217e0d5ae483ae265d9a"} Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.152441 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b26ba76973d9ec22bccc49af00997f36bc34ee4f2fda7a368a5405af52001fac"} Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.152737 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.152761 4713 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:24 crc kubenswrapper[4713]: E0308 00:11:24.153147 4713 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.153165 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.153686 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.154118 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.154465 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial 
tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.154764 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.155109 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.155563 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.156037 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.156307 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.156439 4713 
status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.156762 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.157470 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.157564 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bc756539acbbd6016530861f0ca3f1b19c51ce9445da649b72e4dbdfb56cf2b7"} Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.158599 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.159058 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.159519 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.159906 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.160454 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.160933 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.161347 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.161847 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.162300 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.162880 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.237719 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.237776 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.281703 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.282125 4713 
status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.283094 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.283959 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.284410 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.284907 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 
00:11:24.285313 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.285610 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.286058 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.286603 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.287142 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: E0308 00:11:24.453132 
4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="6.4s" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.692713 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.692782 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.729255 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.729856 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.730567 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.731027 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: 
connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.731359 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.731661 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.731964 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.732235 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.732534 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 
38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.732888 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:24 crc kubenswrapper[4713]: I0308 00:11:24.733160 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.203088 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.203854 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.204330 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.204778 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.205078 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.205309 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.205596 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.205800 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.206003 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.206182 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.206389 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.209238 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.209752 4713 status_manager.go:851] "Failed to get status for pod" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" pod="openshift-marketplace/community-operators-pd9br" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pd9br\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.209968 4713 status_manager.go:851] "Failed to get status for pod" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" pod="openshift-marketplace/redhat-operators-57pjt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-57pjt\": dial 
tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.210227 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.210569 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.210804 4713 status_manager.go:851] "Failed to get status for pod" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.211047 4713 status_manager.go:851] "Failed to get status for pod" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" pod="openshift-marketplace/redhat-marketplace-hs88q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hs88q\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.211259 4713 status_manager.go:851] "Failed to get status for pod" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" pod="openshift-marketplace/redhat-operators-rdgpc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rdgpc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.211476 4713 status_manager.go:851] "Failed to get status for pod" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" pod="openshift-marketplace/certified-operators-x6gcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x6gcb\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.211685 4713 status_manager.go:851] "Failed to get status for pod" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" pod="openshift-marketplace/certified-operators-x7pkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-x7pkf\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:25 crc kubenswrapper[4713]: I0308 00:11:25.211972 4713 status_manager.go:851] "Failed to get status for pod" podUID="40864d72-e137-478e-8340-8c0f107b4c60" pod="openshift-marketplace/community-operators-4tj99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4tj99\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 08 00:11:26 crc kubenswrapper[4713]: I0308 00:11:26.169649 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1cd7e25d02054b293f534ff2e47e1f55bee990db4d8ab079e3a609f0ad8ebcdf"} Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.178794 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"42356eae2569e4cdceef545ada9e3f57b0018356b39cd47ad055a4dfb933acc9"} Mar 08 
00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.179938 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.180059 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"81d9f67cad6662a85e214fa9b3812349ae8adddb5bebd4bd202c6f33e7b6be24"} Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.180220 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c4ebf0f92d1e7564cc3acf1efa9ad3009b0cd48b1b1a27c985a0e02a8a3b19b4"} Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.180336 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ded44558bde8bbf893974ab43495d67acd5c3f360394bd11d2a4a5a3eccce799"} Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.179297 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:27 crc kubenswrapper[4713]: I0308 00:11:27.180530 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:28 crc kubenswrapper[4713]: I0308 00:11:28.555298 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:28 crc kubenswrapper[4713]: I0308 00:11:28.555344 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:28 crc kubenswrapper[4713]: I0308 00:11:28.560452 4713 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.626755 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.627095 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.629164 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.629959 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.638864 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.643503 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:11:29 crc kubenswrapper[4713]: I0308 00:11:29.650497 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 00:11:30 crc kubenswrapper[4713]: W0308 00:11:30.098411 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-d60597421c14a2c2522f0eb569437438b3518aeead10cf41acd7da94682afead WatchSource:0}: Error finding container d60597421c14a2c2522f0eb569437438b3518aeead10cf41acd7da94682afead: Status 404 returned error can't find the container with id d60597421c14a2c2522f0eb569437438b3518aeead10cf41acd7da94682afead Mar 08 00:11:30 crc kubenswrapper[4713]: I0308 00:11:30.198304 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d60597421c14a2c2522f0eb569437438b3518aeead10cf41acd7da94682afead"} Mar 08 00:11:31 crc kubenswrapper[4713]: I0308 00:11:31.005642 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:11:31 crc kubenswrapper[4713]: I0308 00:11:31.012604 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:11:31 crc kubenswrapper[4713]: I0308 00:11:31.206597 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"795b7cfb0b5268f531cd919ce190748af9e0691f8457e2d8607f31d4374958cd"} Mar 08 00:11:31 crc kubenswrapper[4713]: I0308 00:11:31.207007 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:11:32 crc kubenswrapper[4713]: I0308 00:11:32.187593 4713 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:33 crc kubenswrapper[4713]: I0308 00:11:33.215415 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:33 crc kubenswrapper[4713]: I0308 00:11:33.215447 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:33 crc kubenswrapper[4713]: I0308 00:11:33.218940 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:33 crc kubenswrapper[4713]: I0308 00:11:33.223964 4713 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7d8bb7b8-9c57-40e6-90fc-52441b10732b" Mar 08 00:11:34 crc kubenswrapper[4713]: I0308 00:11:34.219876 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:34 crc kubenswrapper[4713]: I0308 00:11:34.219902 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="160301c9-6c5f-40f1-a40f-a0498b367a6e" Mar 08 00:11:36 crc kubenswrapper[4713]: I0308 00:11:36.562808 4713 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7d8bb7b8-9c57-40e6-90fc-52441b10732b" Mar 08 00:11:41 crc kubenswrapper[4713]: I0308 00:11:41.370226 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 00:11:41 crc kubenswrapper[4713]: I0308 00:11:41.422324 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 08 00:11:41 crc kubenswrapper[4713]: I0308 00:11:41.717944 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 00:11:42 crc kubenswrapper[4713]: I0308 00:11:42.029618 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 08 00:11:42 crc kubenswrapper[4713]: I0308 00:11:42.141524 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:11:42 crc kubenswrapper[4713]: I0308 00:11:42.476303 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 08 00:11:42 crc kubenswrapper[4713]: I0308 00:11:42.529947 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 08 00:11:42 crc kubenswrapper[4713]: I0308 00:11:42.673415 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.209770 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.400969 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.495281 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.519089 4713 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.673512 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.689259 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.723290 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.800700 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.895122 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 00:11:43 crc kubenswrapper[4713]: I0308 00:11:43.960947 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.049195 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.066672 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.376440 4713 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.638407 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.639630 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.870523 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.877766 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.908160 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.947923 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 00:11:44 crc kubenswrapper[4713]: I0308 00:11:44.977121 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.080435 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.149017 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.173412 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.211414 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.217239 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.314182 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.372120 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.419976 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.427053 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.469714 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.539144 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.546621 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.581980 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.608153 4713 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.636636 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.703188 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.742210 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.869629 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.894568 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.952577 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.958360 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 00:11:45 crc kubenswrapper[4713]: I0308 00:11:45.986984 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.065650 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.068365 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.068856 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.157606 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.177272 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.177985 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.343740 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.434875 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.452229 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.488657 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.495042 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.524950 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.535331 4713 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.584010 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.596141 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.647914 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.671158 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.671654 4713 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.672048 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pd9br" podStartSLOduration=36.546779217 podStartE2EDuration="2m5.672032273s" podCreationTimestamp="2026-03-08 00:09:41 +0000 UTC" firstStartedPulling="2026-03-08 00:09:43.284892235 +0000 UTC m=+237.404524468" lastFinishedPulling="2026-03-08 00:11:12.410145281 +0000 UTC m=+326.529777524" observedRunningTime="2026-03-08 00:11:31.955153349 +0000 UTC m=+346.074785582" watchObservedRunningTime="2026-03-08 00:11:46.672032273 +0000 UTC m=+360.791664516" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.674037 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=37.67402798 podStartE2EDuration="37.67402798s" podCreationTimestamp="2026-03-08 00:11:09 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:11:31.823721772 +0000 UTC m=+345.943354005" watchObservedRunningTime="2026-03-08 00:11:46.67402798 +0000 UTC m=+360.793660233" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.674350 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-57pjt" podStartSLOduration=80.847170258 podStartE2EDuration="2m3.67434513s" podCreationTimestamp="2026-03-08 00:09:43 +0000 UTC" firstStartedPulling="2026-03-08 00:10:31.656635352 +0000 UTC m=+285.776267585" lastFinishedPulling="2026-03-08 00:11:14.483810224 +0000 UTC m=+328.603442457" observedRunningTime="2026-03-08 00:11:31.970373469 +0000 UTC m=+346.090005722" watchObservedRunningTime="2026-03-08 00:11:46.67434513 +0000 UTC m=+360.793977363" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.674546 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rdgpc" podStartSLOduration=79.538307035 podStartE2EDuration="2m2.674541025s" podCreationTimestamp="2026-03-08 00:09:44 +0000 UTC" firstStartedPulling="2026-03-08 00:10:31.659743604 +0000 UTC m=+285.779375837" lastFinishedPulling="2026-03-08 00:11:14.795977594 +0000 UTC m=+328.915609827" observedRunningTime="2026-03-08 00:11:31.888505054 +0000 UTC m=+346.008137287" watchObservedRunningTime="2026-03-08 00:11:46.674541025 +0000 UTC m=+360.794173268" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.674808 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x7pkf" podStartSLOduration=38.755281452 podStartE2EDuration="2m6.674803793s" podCreationTimestamp="2026-03-08 00:09:40 +0000 UTC" firstStartedPulling="2026-03-08 00:09:43.24727236 +0000 UTC m=+237.366904593" lastFinishedPulling="2026-03-08 00:11:11.166794701 +0000 UTC 
m=+325.286426934" observedRunningTime="2026-03-08 00:11:31.912757865 +0000 UTC m=+346.032390098" watchObservedRunningTime="2026-03-08 00:11:46.674803793 +0000 UTC m=+360.794436026" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.674921 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4tj99" podStartSLOduration=37.409628292 podStartE2EDuration="2m6.674917796s" podCreationTimestamp="2026-03-08 00:09:40 +0000 UTC" firstStartedPulling="2026-03-08 00:09:43.22698253 +0000 UTC m=+237.346614763" lastFinishedPulling="2026-03-08 00:11:12.492272034 +0000 UTC m=+326.611904267" observedRunningTime="2026-03-08 00:11:31.941523755 +0000 UTC m=+346.061155988" watchObservedRunningTime="2026-03-08 00:11:46.674917796 +0000 UTC m=+360.794550039" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.677907 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hs88q" podStartSLOduration=42.383095358 podStartE2EDuration="2m4.677896312s" podCreationTimestamp="2026-03-08 00:09:42 +0000 UTC" firstStartedPulling="2026-03-08 00:09:48.810456852 +0000 UTC m=+242.930089085" lastFinishedPulling="2026-03-08 00:11:11.105257786 +0000 UTC m=+325.224890039" observedRunningTime="2026-03-08 00:11:31.875313853 +0000 UTC m=+345.994946086" watchObservedRunningTime="2026-03-08 00:11:46.677896312 +0000 UTC m=+360.797528565" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.678718 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.678762 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.682502 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 00:11:46 crc 
kubenswrapper[4713]: I0308 00:11:46.718433 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.718411573000001 podStartE2EDuration="14.718411573s" podCreationTimestamp="2026-03-08 00:11:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:11:46.697419586 +0000 UTC m=+360.817051839" watchObservedRunningTime="2026-03-08 00:11:46.718411573 +0000 UTC m=+360.838043806" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.746580 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.784277 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.864542 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.864765 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.900740 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.917490 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 00:11:46 crc kubenswrapper[4713]: I0308 00:11:46.989852 4713 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.050100 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.155288 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.378338 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.380560 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.569328 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.669156 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.674365 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.681233 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.691416 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 00:11:47 crc kubenswrapper[4713]: I0308 00:11:47.896622 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.200495 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 00:11:48 crc 
kubenswrapper[4713]: I0308 00:11:48.233204 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.392636 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.400752 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.413357 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.457935 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.482914 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.518062 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.522101 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.594488 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.599893 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.654177 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.748437 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.846664 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.982670 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.993397 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 00:11:48 crc kubenswrapper[4713]: I0308 00:11:48.995678 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.083761 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.123068 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.158962 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.163860 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.266173 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 
00:11:49.388591 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.452991 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.613447 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.660718 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.679474 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.807582 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.927436 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.932058 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.932931 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 00:11:49 crc kubenswrapper[4713]: I0308 00:11:49.956766 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.038493 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 08 00:11:50 crc 
kubenswrapper[4713]: I0308 00:11:50.177768 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.197275 4713 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.252511 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.296184 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.416507 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.575916 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.621719 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.635716 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.695714 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.703861 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.716065 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 08 00:11:50 
crc kubenswrapper[4713]: I0308 00:11:50.738907 4713 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.755260 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.804655 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.899487 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.969779 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.983053 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 00:11:50 crc kubenswrapper[4713]: I0308 00:11:50.993309 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.044879 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.049769 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.054177 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.125882 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.138552 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.212732 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.327600 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.371212 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.516162 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.534379 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.572240 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.581428 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.622228 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.687276 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.716381 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 00:11:51 crc kubenswrapper[4713]: I0308 00:11:51.733237 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.053420 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.194595 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.225252 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.258593 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.283030 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.346121 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.398451 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.441808 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.478929 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.519724 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.546314 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.547805 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.554380 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.597073 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.602551 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.620415 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.672788 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.728463 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.783882 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 00:11:52 crc 
kubenswrapper[4713]: I0308 00:11:52.784547 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.803678 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.857275 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.886520 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 08 00:11:52 crc kubenswrapper[4713]: I0308 00:11:52.980057 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.149311 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.197211 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.297360 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.317219 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.332108 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.469150 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.480469 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.516436 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.607704 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.613800 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.641570 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.646857 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.867302 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.878758 4713 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.898112 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 00:11:53.939574 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 00:11:53 crc kubenswrapper[4713]: I0308 
00:11:53.970172 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.003492 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.007370 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.025212 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.061388 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.154807 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.333805 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.343242 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.390375 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.432725 4713 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.432950 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293" gracePeriod=5 Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.553977 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.620264 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.709988 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.783922 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.796418 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.810645 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.970403 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.984150 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:11:54 crc kubenswrapper[4713]: I0308 00:11:54.994565 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.032410 4713 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.062567 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.083648 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.237929 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.241916 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.333727 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.356771 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.361748 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.364274 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.418228 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.472175 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 
00:11:55.653368 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.694731 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.849447 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.852728 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.876100 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.917391 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 00:11:55 crc kubenswrapper[4713]: I0308 00:11:55.994725 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.010919 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.048716 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.058656 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.153110 4713 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.170849 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.218978 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.227274 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.334970 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.433659 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.496040 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.515457 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 08 00:11:56 crc kubenswrapper[4713]: I0308 00:11:56.778944 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.047685 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.064440 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.375546 
4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.455119 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.476723 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 00:11:57 crc kubenswrapper[4713]: I0308 00:11:57.644025 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 00:11:58 crc kubenswrapper[4713]: I0308 00:11:58.025130 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 08 00:11:58 crc kubenswrapper[4713]: I0308 00:11:58.222984 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 00:11:58 crc kubenswrapper[4713]: I0308 00:11:58.695993 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.038525 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.038965 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.168409 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548812-24fjw"] Mar 08 00:12:00 crc kubenswrapper[4713]: E0308 00:12:00.168867 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.168901 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 00:12:00 crc kubenswrapper[4713]: E0308 00:12:00.168938 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" containerName="installer" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.168958 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" containerName="installer" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.169194 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc51fa12-ec6c-48ee-8fd5-55388414d54f" containerName="installer" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.169233 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.170208 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.173035 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.174386 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.174406 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.177570 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548812-24fjw"] Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.205523 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.205618 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.205658 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.205693 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.205727 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.206177 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.206243 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.206281 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.206624 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.215431 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.306944 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ncv9\" (UniqueName: \"kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9\") pod \"auto-csr-approver-29548812-24fjw\" (UID: \"12cdabef-a56e-45d2-8896-aab98bd84fb1\") " pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.307281 4713 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.307371 4713 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.307458 4713 reconciler_common.go:293] "Volume detached for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.307529 4713 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.307602 4713 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.359852 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.359900 4713 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293" exitCode=137 Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.360034 4713 scope.go:117] "RemoveContainer" containerID="ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.360046 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.384589 4713 scope.go:117] "RemoveContainer" containerID="ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293" Mar 08 00:12:00 crc kubenswrapper[4713]: E0308 00:12:00.385004 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293\": container with ID starting with ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293 not found: ID does not exist" containerID="ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.385051 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293"} err="failed to get container status \"ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293\": rpc error: code = NotFound desc = could not find container \"ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293\": container with ID starting with ee950c82c71f89197c3fdd129495b9b1ccc432ef6fac280107d19124be838293 not found: ID does not exist" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.408809 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ncv9\" (UniqueName: \"kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9\") pod \"auto-csr-approver-29548812-24fjw\" (UID: \"12cdabef-a56e-45d2-8896-aab98bd84fb1\") " pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.429883 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ncv9\" (UniqueName: 
\"kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9\") pod \"auto-csr-approver-29548812-24fjw\" (UID: \"12cdabef-a56e-45d2-8896-aab98bd84fb1\") " pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.499338 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.547302 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.547579 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.558085 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.558128 4713 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1b60054b-377a-42aa-a77b-1946ed626065" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.569739 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.569789 4713 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1b60054b-377a-42aa-a77b-1946ed626065" Mar 08 00:12:00 crc kubenswrapper[4713]: I0308 00:12:00.891630 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548812-24fjw"] Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.366170 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548812-24fjw" event={"ID":"12cdabef-a56e-45d2-8896-aab98bd84fb1","Type":"ContainerStarted","Data":"4f3257c130a12b7f62d39b42bf8c076b22c12811abedce81b9b8ef554ca7f546"} Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.402088 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.402336 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" podUID="58583d53-0add-4758-8d8b-c309a79b4c48" containerName="controller-manager" containerID="cri-o://238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1" gracePeriod=30 Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.496488 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.497079 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" podUID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" containerName="route-controller-manager" containerID="cri-o://60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3" gracePeriod=30 Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.748193 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.854557 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.927585 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert\") pod \"58583d53-0add-4758-8d8b-c309a79b4c48\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.927625 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48rlp\" (UniqueName: \"kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp\") pod \"58583d53-0add-4758-8d8b-c309a79b4c48\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.927662 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca\") pod \"58583d53-0add-4758-8d8b-c309a79b4c48\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.927684 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config\") pod \"58583d53-0add-4758-8d8b-c309a79b4c48\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.927730 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles\") pod \"58583d53-0add-4758-8d8b-c309a79b4c48\" (UID: \"58583d53-0add-4758-8d8b-c309a79b4c48\") " Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.928730 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config" (OuterVolumeSpecName: "config") pod "58583d53-0add-4758-8d8b-c309a79b4c48" (UID: "58583d53-0add-4758-8d8b-c309a79b4c48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.928726 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "58583d53-0add-4758-8d8b-c309a79b4c48" (UID: "58583d53-0add-4758-8d8b-c309a79b4c48"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.929077 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.929108 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.929264 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca" (OuterVolumeSpecName: "client-ca") pod "58583d53-0add-4758-8d8b-c309a79b4c48" (UID: "58583d53-0add-4758-8d8b-c309a79b4c48"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.932710 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "58583d53-0add-4758-8d8b-c309a79b4c48" (UID: "58583d53-0add-4758-8d8b-c309a79b4c48"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:01 crc kubenswrapper[4713]: I0308 00:12:01.932812 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp" (OuterVolumeSpecName: "kube-api-access-48rlp") pod "58583d53-0add-4758-8d8b-c309a79b4c48" (UID: "58583d53-0add-4758-8d8b-c309a79b4c48"). InnerVolumeSpecName "kube-api-access-48rlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.029794 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjt7x\" (UniqueName: \"kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x\") pod \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.029875 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config\") pod \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.029957 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca\") pod \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " 
Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.029992 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert\") pod \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\" (UID: \"7daca87e-5103-46bd-b6ae-7643c66a4fbc\") " Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.030715 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca" (OuterVolumeSpecName: "client-ca") pod "7daca87e-5103-46bd-b6ae-7643c66a4fbc" (UID: "7daca87e-5103-46bd-b6ae-7643c66a4fbc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.030722 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58583d53-0add-4758-8d8b-c309a79b4c48-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.030764 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48rlp\" (UniqueName: \"kubernetes.io/projected/58583d53-0add-4758-8d8b-c309a79b4c48-kube-api-access-48rlp\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.030778 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58583d53-0add-4758-8d8b-c309a79b4c48-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.030880 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config" (OuterVolumeSpecName: "config") pod "7daca87e-5103-46bd-b6ae-7643c66a4fbc" (UID: "7daca87e-5103-46bd-b6ae-7643c66a4fbc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.032732 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x" (OuterVolumeSpecName: "kube-api-access-zjt7x") pod "7daca87e-5103-46bd-b6ae-7643c66a4fbc" (UID: "7daca87e-5103-46bd-b6ae-7643c66a4fbc"). InnerVolumeSpecName "kube-api-access-zjt7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.032974 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7daca87e-5103-46bd-b6ae-7643c66a4fbc" (UID: "7daca87e-5103-46bd-b6ae-7643c66a4fbc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.132188 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjt7x\" (UniqueName: \"kubernetes.io/projected/7daca87e-5103-46bd-b6ae-7643c66a4fbc-kube-api-access-zjt7x\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.132218 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.132227 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7daca87e-5103-46bd-b6ae-7643c66a4fbc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.132237 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daca87e-5103-46bd-b6ae-7643c66a4fbc-serving-cert\") on node \"crc\" DevicePath 
\"\"" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.372440 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548812-24fjw" event={"ID":"12cdabef-a56e-45d2-8896-aab98bd84fb1","Type":"ContainerStarted","Data":"71f869c9a3deae4099eb6a9e0da68e9d0801b114263bfc45efc59f3dae8002be"} Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.373952 4713 generic.go:334] "Generic (PLEG): container finished" podID="58583d53-0add-4758-8d8b-c309a79b4c48" containerID="238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1" exitCode=0 Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.374016 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" event={"ID":"58583d53-0add-4758-8d8b-c309a79b4c48","Type":"ContainerDied","Data":"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1"} Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.374036 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" event={"ID":"58583d53-0add-4758-8d8b-c309a79b4c48","Type":"ContainerDied","Data":"bf14b4768a06207e44a9e2b8f817f874dac0b317715a2c1cef7640a7a7b1ee98"} Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.374043 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c8fc9c-nklnq" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.374056 4713 scope.go:117] "RemoveContainer" containerID="238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.376052 4713 generic.go:334] "Generic (PLEG): container finished" podID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" containerID="60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3" exitCode=0 Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.376085 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" event={"ID":"7daca87e-5103-46bd-b6ae-7643c66a4fbc","Type":"ContainerDied","Data":"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3"} Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.376107 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" event={"ID":"7daca87e-5103-46bd-b6ae-7643c66a4fbc","Type":"ContainerDied","Data":"ca8e90ef695a32802124e9aceef3123bdb89dbe43217f030e702dfd71adfbdc7"} Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.376139 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.389263 4713 scope.go:117] "RemoveContainer" containerID="238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1" Mar 08 00:12:02 crc kubenswrapper[4713]: E0308 00:12:02.392392 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1\": container with ID starting with 238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1 not found: ID does not exist" containerID="238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.392480 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1"} err="failed to get container status \"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1\": rpc error: code = NotFound desc = could not find container \"238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1\": container with ID starting with 238939e0ac613a93c7f81361efaa248cfbfc00a216328355e01173bb9d45efb1 not found: ID does not exist" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.392520 4713 scope.go:117] "RemoveContainer" containerID="60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.400473 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548812-24fjw" podStartSLOduration=1.30879019 podStartE2EDuration="2.400454991s" podCreationTimestamp="2026-03-08 00:12:00 +0000 UTC" firstStartedPulling="2026-03-08 00:12:00.90143606 +0000 UTC m=+375.021068323" lastFinishedPulling="2026-03-08 00:12:01.993100891 +0000 UTC m=+376.112733124" 
observedRunningTime="2026-03-08 00:12:02.391896804 +0000 UTC m=+376.511529057" watchObservedRunningTime="2026-03-08 00:12:02.400454991 +0000 UTC m=+376.520087234" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.408194 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.411193 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b94cbf9d6-j2rxl"] Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.414191 4713 scope.go:117] "RemoveContainer" containerID="60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.414525 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:12:02 crc kubenswrapper[4713]: E0308 00:12:02.414586 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3\": container with ID starting with 60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3 not found: ID does not exist" containerID="60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.414612 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3"} err="failed to get container status \"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3\": rpc error: code = NotFound desc = could not find container \"60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3\": container with ID starting with 60b716d027634d1d9bfd56752b1e12c7b7eb837d727fb4d3708bc8b18f7698a3 not found: ID does not exist" Mar 08 
00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.417528 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b59c8fc9c-nklnq"] Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.547661 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58583d53-0add-4758-8d8b-c309a79b4c48" path="/var/lib/kubelet/pods/58583d53-0add-4758-8d8b-c309a79b4c48/volumes" Mar 08 00:12:02 crc kubenswrapper[4713]: I0308 00:12:02.549653 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" path="/var/lib/kubelet/pods/7daca87e-5103-46bd-b6ae-7643c66a4fbc/volumes" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.387751 4713 generic.go:334] "Generic (PLEG): container finished" podID="12cdabef-a56e-45d2-8896-aab98bd84fb1" containerID="71f869c9a3deae4099eb6a9e0da68e9d0801b114263bfc45efc59f3dae8002be" exitCode=0 Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.387818 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548812-24fjw" event={"ID":"12cdabef-a56e-45d2-8896-aab98bd84fb1","Type":"ContainerDied","Data":"71f869c9a3deae4099eb6a9e0da68e9d0801b114263bfc45efc59f3dae8002be"} Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.414294 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:03 crc kubenswrapper[4713]: E0308 00:12:03.414698 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58583d53-0add-4758-8d8b-c309a79b4c48" containerName="controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.414725 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="58583d53-0add-4758-8d8b-c309a79b4c48" containerName="controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: E0308 00:12:03.414761 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" containerName="route-controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.414773 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" containerName="route-controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.414950 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7daca87e-5103-46bd-b6ae-7643c66a4fbc" containerName="route-controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.414969 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="58583d53-0add-4758-8d8b-c309a79b4c48" containerName="controller-manager" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.415415 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.420421 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.421897 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.422018 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.422413 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.424669 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.426278 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.435083 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.444023 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.444960 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.448257 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.448556 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.448605 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.448685 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.448576 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449411 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: 
\"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449471 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449518 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449553 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449594 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mslqk\" (UniqueName: \"kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449631 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449665 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z726j\" (UniqueName: \"kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449702 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.449853 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.453276 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.465959 4713 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.472032 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.551797 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.551901 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z726j\" (UniqueName: \"kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.551954 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552014 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " 
pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552102 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552138 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552183 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552222 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.552270 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mslqk\" (UniqueName: 
\"kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.553112 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.553921 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.553966 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.554085 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.556392 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.556547 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.557360 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.567201 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z726j\" (UniqueName: \"kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j\") pod \"route-controller-manager-6575bb6f8c-p6445\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.575734 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mslqk\" (UniqueName: \"kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk\") pod \"controller-manager-795f4d9bc7-g9wgf\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " 
pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.754471 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:03 crc kubenswrapper[4713]: I0308 00:12:03.766676 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.182559 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:04 crc kubenswrapper[4713]: W0308 00:12:04.188185 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a0c57d_18d7_440f_aa59_4a55988fcd25.slice/crio-d6f0dd9d549fc0ef306ade913afb52742256dcf11ce9922697ee66b5e4b3851b WatchSource:0}: Error finding container d6f0dd9d549fc0ef306ade913afb52742256dcf11ce9922697ee66b5e4b3851b: Status 404 returned error can't find the container with id d6f0dd9d549fc0ef306ade913afb52742256dcf11ce9922697ee66b5e4b3851b Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.235292 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:04 crc kubenswrapper[4713]: W0308 00:12:04.240289 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67b526e8_eda3_4eaf_b5ed_15ed74c51d76.slice/crio-00559d3f800cb54f8f53c6d7c5f012513908b1826af624111ab30d74222442a0 WatchSource:0}: Error finding container 00559d3f800cb54f8f53c6d7c5f012513908b1826af624111ab30d74222442a0: Status 404 returned error can't find the container with id 00559d3f800cb54f8f53c6d7c5f012513908b1826af624111ab30d74222442a0 Mar 08 00:12:04 crc 
kubenswrapper[4713]: I0308 00:12:04.395477 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" event={"ID":"67b526e8-eda3-4eaf-b5ed-15ed74c51d76","Type":"ContainerStarted","Data":"6c658c8b4a03fefe8008b7910e27cb534da06e0671543307b2db80f93874f42f"} Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.395795 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" event={"ID":"67b526e8-eda3-4eaf-b5ed-15ed74c51d76","Type":"ContainerStarted","Data":"00559d3f800cb54f8f53c6d7c5f012513908b1826af624111ab30d74222442a0"} Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.395815 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.397240 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" event={"ID":"14a0c57d-18d7-440f-aa59-4a55988fcd25","Type":"ContainerStarted","Data":"2345286ba622f88cf52365f92bf3637004b5c8547fe7560e870332653d0ac5f6"} Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.397285 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" event={"ID":"14a0c57d-18d7-440f-aa59-4a55988fcd25","Type":"ContainerStarted","Data":"d6f0dd9d549fc0ef306ade913afb52742256dcf11ce9922697ee66b5e4b3851b"} Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.397631 4713 patch_prober.go:28] interesting pod/controller-manager-795f4d9bc7-g9wgf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.397674 4713 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.414781 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" podStartSLOduration=3.414752659 podStartE2EDuration="3.414752659s" podCreationTimestamp="2026-03-08 00:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:04.413684468 +0000 UTC m=+378.533316711" watchObservedRunningTime="2026-03-08 00:12:04.414752659 +0000 UTC m=+378.534384892" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.623786 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.766199 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ncv9\" (UniqueName: \"kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9\") pod \"12cdabef-a56e-45d2-8896-aab98bd84fb1\" (UID: \"12cdabef-a56e-45d2-8896-aab98bd84fb1\") " Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.772176 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9" (OuterVolumeSpecName: "kube-api-access-6ncv9") pod "12cdabef-a56e-45d2-8896-aab98bd84fb1" (UID: "12cdabef-a56e-45d2-8896-aab98bd84fb1"). InnerVolumeSpecName "kube-api-access-6ncv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:04 crc kubenswrapper[4713]: I0308 00:12:04.867660 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ncv9\" (UniqueName: \"kubernetes.io/projected/12cdabef-a56e-45d2-8896-aab98bd84fb1-kube-api-access-6ncv9\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.403422 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548812-24fjw" event={"ID":"12cdabef-a56e-45d2-8896-aab98bd84fb1","Type":"ContainerDied","Data":"4f3257c130a12b7f62d39b42bf8c076b22c12811abedce81b9b8ef554ca7f546"} Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.403607 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f3257c130a12b7f62d39b42bf8c076b22c12811abedce81b9b8ef554ca7f546" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.403479 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548812-24fjw" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.404085 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.407243 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.411910 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:05 crc kubenswrapper[4713]: I0308 00:12:05.427567 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" podStartSLOduration=4.427547132 
podStartE2EDuration="4.427547132s" podCreationTimestamp="2026-03-08 00:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:05.421696043 +0000 UTC m=+379.541328286" watchObservedRunningTime="2026-03-08 00:12:05.427547132 +0000 UTC m=+379.547179375" Mar 08 00:12:06 crc kubenswrapper[4713]: I0308 00:12:06.343240 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 00:12:23 crc kubenswrapper[4713]: I0308 00:12:23.813682 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 00:12:26 crc kubenswrapper[4713]: I0308 00:12:26.856894 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:26 crc kubenswrapper[4713]: I0308 00:12:26.857419 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerName="controller-manager" containerID="cri-o://6c658c8b4a03fefe8008b7910e27cb534da06e0671543307b2db80f93874f42f" gracePeriod=30 Mar 08 00:12:26 crc kubenswrapper[4713]: I0308 00:12:26.870988 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:26 crc kubenswrapper[4713]: I0308 00:12:26.871221 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" podUID="14a0c57d-18d7-440f-aa59-4a55988fcd25" containerName="route-controller-manager" containerID="cri-o://2345286ba622f88cf52365f92bf3637004b5c8547fe7560e870332653d0ac5f6" gracePeriod=30 Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 
00:12:27.742989 4713 generic.go:334] "Generic (PLEG): container finished" podID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerID="6c658c8b4a03fefe8008b7910e27cb534da06e0671543307b2db80f93874f42f" exitCode=0 Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.743071 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" event={"ID":"67b526e8-eda3-4eaf-b5ed-15ed74c51d76","Type":"ContainerDied","Data":"6c658c8b4a03fefe8008b7910e27cb534da06e0671543307b2db80f93874f42f"} Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.745073 4713 generic.go:334] "Generic (PLEG): container finished" podID="14a0c57d-18d7-440f-aa59-4a55988fcd25" containerID="2345286ba622f88cf52365f92bf3637004b5c8547fe7560e870332653d0ac5f6" exitCode=0 Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.745097 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" event={"ID":"14a0c57d-18d7-440f-aa59-4a55988fcd25","Type":"ContainerDied","Data":"2345286ba622f88cf52365f92bf3637004b5c8547fe7560e870332653d0ac5f6"} Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.939680 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.968466 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:27 crc kubenswrapper[4713]: E0308 00:12:27.968700 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a0c57d-18d7-440f-aa59-4a55988fcd25" containerName="route-controller-manager" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.968715 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a0c57d-18d7-440f-aa59-4a55988fcd25" containerName="route-controller-manager" Mar 08 00:12:27 crc kubenswrapper[4713]: E0308 00:12:27.968731 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12cdabef-a56e-45d2-8896-aab98bd84fb1" containerName="oc" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.968739 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cdabef-a56e-45d2-8896-aab98bd84fb1" containerName="oc" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.968880 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="12cdabef-a56e-45d2-8896-aab98bd84fb1" containerName="oc" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.968895 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a0c57d-18d7-440f-aa59-4a55988fcd25" containerName="route-controller-manager" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.969246 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:27 crc kubenswrapper[4713]: I0308 00:12:27.980215 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.005252 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.126317 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config\") pod \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127029 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config\") pod \"14a0c57d-18d7-440f-aa59-4a55988fcd25\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127126 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mslqk\" (UniqueName: \"kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk\") pod \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127175 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert\") pod \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127248 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z726j\" (UniqueName: \"kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j\") pod \"14a0c57d-18d7-440f-aa59-4a55988fcd25\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127293 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles\") pod \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127321 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca\") pod \"14a0c57d-18d7-440f-aa59-4a55988fcd25\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127342 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert\") pod \"14a0c57d-18d7-440f-aa59-4a55988fcd25\" (UID: \"14a0c57d-18d7-440f-aa59-4a55988fcd25\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127375 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca\") pod \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\" (UID: \"67b526e8-eda3-4eaf-b5ed-15ed74c51d76\") " Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127529 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca\") pod 
\"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127640 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127668 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.127692 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnk2s\" (UniqueName: \"kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.128391 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config" (OuterVolumeSpecName: "config") pod "67b526e8-eda3-4eaf-b5ed-15ed74c51d76" (UID: "67b526e8-eda3-4eaf-b5ed-15ed74c51d76"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.128464 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca" (OuterVolumeSpecName: "client-ca") pod "14a0c57d-18d7-440f-aa59-4a55988fcd25" (UID: "14a0c57d-18d7-440f-aa59-4a55988fcd25"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.128596 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config" (OuterVolumeSpecName: "config") pod "14a0c57d-18d7-440f-aa59-4a55988fcd25" (UID: "14a0c57d-18d7-440f-aa59-4a55988fcd25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.128759 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "67b526e8-eda3-4eaf-b5ed-15ed74c51d76" (UID: "67b526e8-eda3-4eaf-b5ed-15ed74c51d76"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.128842 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca" (OuterVolumeSpecName: "client-ca") pod "67b526e8-eda3-4eaf-b5ed-15ed74c51d76" (UID: "67b526e8-eda3-4eaf-b5ed-15ed74c51d76"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.132133 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j" (OuterVolumeSpecName: "kube-api-access-z726j") pod "14a0c57d-18d7-440f-aa59-4a55988fcd25" (UID: "14a0c57d-18d7-440f-aa59-4a55988fcd25"). InnerVolumeSpecName "kube-api-access-z726j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.132161 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "14a0c57d-18d7-440f-aa59-4a55988fcd25" (UID: "14a0c57d-18d7-440f-aa59-4a55988fcd25"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.132370 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk" (OuterVolumeSpecName: "kube-api-access-mslqk") pod "67b526e8-eda3-4eaf-b5ed-15ed74c51d76" (UID: "67b526e8-eda3-4eaf-b5ed-15ed74c51d76"). InnerVolumeSpecName "kube-api-access-mslqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.134481 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67b526e8-eda3-4eaf-b5ed-15ed74c51d76" (UID: "67b526e8-eda3-4eaf-b5ed-15ed74c51d76"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.193568 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.228739 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnk2s\" (UniqueName: \"kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.228862 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.228920 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.228941 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc 
kubenswrapper[4713]: I0308 00:12:28.228984 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z726j\" (UniqueName: \"kubernetes.io/projected/14a0c57d-18d7-440f-aa59-4a55988fcd25-kube-api-access-z726j\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.228997 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229008 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229021 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a0c57d-18d7-440f-aa59-4a55988fcd25-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229032 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229043 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229054 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a0c57d-18d7-440f-aa59-4a55988fcd25-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229065 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mslqk\" (UniqueName: 
\"kubernetes.io/projected/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-kube-api-access-mslqk\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.229076 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b526e8-eda3-4eaf-b5ed-15ed74c51d76-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.230104 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.230172 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.232127 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.244495 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnk2s\" (UniqueName: \"kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s\") pod \"route-controller-manager-67cccf86c6-zhfs5\" (UID: 
\"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.315579 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.723195 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.751621 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" event={"ID":"67b526e8-eda3-4eaf-b5ed-15ed74c51d76","Type":"ContainerDied","Data":"00559d3f800cb54f8f53c6d7c5f012513908b1826af624111ab30d74222442a0"} Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.751668 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.751671 4713 scope.go:117] "RemoveContainer" containerID="6c658c8b4a03fefe8008b7910e27cb534da06e0671543307b2db80f93874f42f" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.760702 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" event={"ID":"14a0c57d-18d7-440f-aa59-4a55988fcd25","Type":"ContainerDied","Data":"d6f0dd9d549fc0ef306ade913afb52742256dcf11ce9922697ee66b5e4b3851b"} Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.760817 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.765116 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" event={"ID":"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528","Type":"ContainerStarted","Data":"8f642808b84e4f9a7dbfc1946365248a00698721e0ae378e73c5caef95a3edb5"} Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.773690 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.775749 4713 scope.go:117] "RemoveContainer" containerID="2345286ba622f88cf52365f92bf3637004b5c8547fe7560e870332653d0ac5f6" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.777905 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-795f4d9bc7-g9wgf"] Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.783517 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.786735 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:28 crc kubenswrapper[4713]: I0308 00:12:28.789815 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6575bb6f8c-p6445"] Mar 08 00:12:29 crc kubenswrapper[4713]: I0308 00:12:29.775316 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" event={"ID":"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528","Type":"ContainerStarted","Data":"a5ad4469ff836c615e5b2bcb96b4fe9efd7c80eb9a37dbbbc54e3aa236361f04"} Mar 08 00:12:29 crc 
kubenswrapper[4713]: I0308 00:12:29.775655 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:29 crc kubenswrapper[4713]: I0308 00:12:29.781918 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:29 crc kubenswrapper[4713]: I0308 00:12:29.824675 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" podStartSLOduration=3.824651296 podStartE2EDuration="3.824651296s" podCreationTimestamp="2026-03-08 00:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:29.800314418 +0000 UTC m=+403.919946701" watchObservedRunningTime="2026-03-08 00:12:29.824651296 +0000 UTC m=+403.944283549" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.549036 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a0c57d-18d7-440f-aa59-4a55988fcd25" path="/var/lib/kubelet/pods/14a0c57d-18d7-440f-aa59-4a55988fcd25/volumes" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.549534 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" path="/var/lib/kubelet/pods/67b526e8-eda3-4eaf-b5ed-15ed74c51d76/volumes" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.686794 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:30 crc kubenswrapper[4713]: E0308 00:12:30.687075 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerName="controller-manager" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.687092 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerName="controller-manager" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.687263 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="67b526e8-eda3-4eaf-b5ed-15ed74c51d76" containerName="controller-manager" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.687712 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.689471 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.690684 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.692084 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.692239 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.695445 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.695889 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.698039 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.702716 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.858777 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn4r8\" (UniqueName: \"kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.858871 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.858920 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.858951 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.858988 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.960231 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.960335 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn4r8\" (UniqueName: \"kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.960371 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.960438 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.960475 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.961865 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.961967 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.968521 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.971356 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 
00:12:30 crc kubenswrapper[4713]: I0308 00:12:30.977202 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn4r8\" (UniqueName: \"kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8\") pod \"controller-manager-5f498ddbb5-wj976\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:31 crc kubenswrapper[4713]: I0308 00:12:31.008583 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:31 crc kubenswrapper[4713]: I0308 00:12:31.408913 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:31 crc kubenswrapper[4713]: W0308 00:12:31.412066 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28926f2e_f630_49fa_87f7_2c82067f06cc.slice/crio-0667e551eeb85abb81e933da12494d8a43adb1cf8dd34c05e62b52f4f8685240 WatchSource:0}: Error finding container 0667e551eeb85abb81e933da12494d8a43adb1cf8dd34c05e62b52f4f8685240: Status 404 returned error can't find the container with id 0667e551eeb85abb81e933da12494d8a43adb1cf8dd34c05e62b52f4f8685240 Mar 08 00:12:31 crc kubenswrapper[4713]: I0308 00:12:31.786908 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" event={"ID":"28926f2e-f630-49fa-87f7-2c82067f06cc","Type":"ContainerStarted","Data":"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687"} Mar 08 00:12:31 crc kubenswrapper[4713]: I0308 00:12:31.786957 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" 
event={"ID":"28926f2e-f630-49fa-87f7-2c82067f06cc","Type":"ContainerStarted","Data":"0667e551eeb85abb81e933da12494d8a43adb1cf8dd34c05e62b52f4f8685240"} Mar 08 00:12:32 crc kubenswrapper[4713]: I0308 00:12:32.793560 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:32 crc kubenswrapper[4713]: I0308 00:12:32.798054 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:32 crc kubenswrapper[4713]: I0308 00:12:32.815759 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" podStartSLOduration=6.8157424429999995 podStartE2EDuration="6.815742443s" podCreationTimestamp="2026-03-08 00:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:31.813134469 +0000 UTC m=+405.932766692" watchObservedRunningTime="2026-03-08 00:12:32.815742443 +0000 UTC m=+406.935374676" Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.570549 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pd9br"] Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.571622 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pd9br" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="registry-server" containerID="cri-o://a032630e16097c96141079adebfc1092e90366030a54b1b60ed4f6c7681a4c79" gracePeriod=2 Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.705609 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"] Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.705848 4713 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/certified-operators-x7pkf" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="registry-server" containerID="cri-o://54d94291bba3da410042a68b46eeee3f18e230b96de2843a430f6d4aa0771496" gracePeriod=2 Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.965480 4713 generic.go:334] "Generic (PLEG): container finished" podID="c33b42a1-bf95-490f-a907-765855ec81d1" containerID="54d94291bba3da410042a68b46eeee3f18e230b96de2843a430f6d4aa0771496" exitCode=0 Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.965688 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerDied","Data":"54d94291bba3da410042a68b46eeee3f18e230b96de2843a430f6d4aa0771496"} Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.969651 4713 generic.go:334] "Generic (PLEG): container finished" podID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerID="a032630e16097c96141079adebfc1092e90366030a54b1b60ed4f6c7681a4c79" exitCode=0 Mar 08 00:12:38 crc kubenswrapper[4713]: I0308 00:12:38.969678 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerDied","Data":"a032630e16097c96141079adebfc1092e90366030a54b1b60ed4f6c7681a4c79"} Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.054692 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.162403 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content\") pod \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.162485 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t4bc\" (UniqueName: \"kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc\") pod \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.162552 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities\") pod \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\" (UID: \"cd4a956b-6edb-436e-bd5e-5d57899c2ea1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.163681 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities" (OuterVolumeSpecName: "utilities") pod "cd4a956b-6edb-436e-bd5e-5d57899c2ea1" (UID: "cd4a956b-6edb-436e-bd5e-5d57899c2ea1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.166580 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.168620 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc" (OuterVolumeSpecName: "kube-api-access-9t4bc") pod "cd4a956b-6edb-436e-bd5e-5d57899c2ea1" (UID: "cd4a956b-6edb-436e-bd5e-5d57899c2ea1"). InnerVolumeSpecName "kube-api-access-9t4bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.232881 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd4a956b-6edb-436e-bd5e-5d57899c2ea1" (UID: "cd4a956b-6edb-436e-bd5e-5d57899c2ea1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.263813 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.263857 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t4bc\" (UniqueName: \"kubernetes.io/projected/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-kube-api-access-9t4bc\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.263871 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4a956b-6edb-436e-bd5e-5d57899c2ea1-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.365040 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bjqb\" (UniqueName: 
\"kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb\") pod \"c33b42a1-bf95-490f-a907-765855ec81d1\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.365344 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities\") pod \"c33b42a1-bf95-490f-a907-765855ec81d1\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.365432 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content\") pod \"c33b42a1-bf95-490f-a907-765855ec81d1\" (UID: \"c33b42a1-bf95-490f-a907-765855ec81d1\") " Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.366673 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities" (OuterVolumeSpecName: "utilities") pod "c33b42a1-bf95-490f-a907-765855ec81d1" (UID: "c33b42a1-bf95-490f-a907-765855ec81d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.368443 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.369558 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb" (OuterVolumeSpecName: "kube-api-access-7bjqb") pod "c33b42a1-bf95-490f-a907-765855ec81d1" (UID: "c33b42a1-bf95-490f-a907-765855ec81d1"). InnerVolumeSpecName "kube-api-access-7bjqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.420185 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c33b42a1-bf95-490f-a907-765855ec81d1" (UID: "c33b42a1-bf95-490f-a907-765855ec81d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.469729 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bjqb\" (UniqueName: \"kubernetes.io/projected/c33b42a1-bf95-490f-a907-765855ec81d1-kube-api-access-7bjqb\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.469762 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33b42a1-bf95-490f-a907-765855ec81d1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.976510 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd9br" event={"ID":"cd4a956b-6edb-436e-bd5e-5d57899c2ea1","Type":"ContainerDied","Data":"135e656a965d1b87bbb089b3e89dbd03d0497fd3df39d718203e4d15ec7454b9"} Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.976869 4713 scope.go:117] "RemoveContainer" containerID="a032630e16097c96141079adebfc1092e90366030a54b1b60ed4f6c7681a4c79" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.976536 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pd9br" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.979805 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7pkf" event={"ID":"c33b42a1-bf95-490f-a907-765855ec81d1","Type":"ContainerDied","Data":"8b84966b96c0ed6376bfb58ebe4d50727b2f7c4a888ad1b3e8b431d7574ba8b4"} Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.979951 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x7pkf" Mar 08 00:12:39 crc kubenswrapper[4713]: I0308 00:12:39.997601 4713 scope.go:117] "RemoveContainer" containerID="c2bf098434bfcc867c8195b8c42297c739230b688ab856c67dbf7a34e9987066" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.007950 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pd9br"] Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.014281 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pd9br"] Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.020734 4713 scope.go:117] "RemoveContainer" containerID="10f6a682f68f33f52b960986a98e4b9b4d5d737c5be6429ad3ce071e85a28622" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.027876 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"] Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.031538 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x7pkf"] Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.037790 4713 scope.go:117] "RemoveContainer" containerID="54d94291bba3da410042a68b46eeee3f18e230b96de2843a430f6d4aa0771496" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.056250 4713 scope.go:117] "RemoveContainer" 
containerID="208d6f7268d01f9f7e50afe48b84246d8fc86cf25d817c7b3ce1701103741603" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.069424 4713 scope.go:117] "RemoveContainer" containerID="f219be814b1ac8475a83125ee5f48f62c739076f91025a6595fb3c6cc2132578" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.306451 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.306680 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hs88q" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="registry-server" containerID="cri-o://023ca4eb6026d184356661b957d297149cfe69e644ecd5ceb7a20eb3c76a9016" gracePeriod=2 Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.548708 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" path="/var/lib/kubelet/pods/c33b42a1-bf95-490f-a907-765855ec81d1/volumes" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.549447 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" path="/var/lib/kubelet/pods/cd4a956b-6edb-436e-bd5e-5d57899c2ea1/volumes" Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.989081 4713 generic.go:334] "Generic (PLEG): container finished" podID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerID="023ca4eb6026d184356661b957d297149cfe69e644ecd5ceb7a20eb3c76a9016" exitCode=0 Mar 08 00:12:40 crc kubenswrapper[4713]: I0308 00:12:40.989166 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerDied","Data":"023ca4eb6026d184356661b957d297149cfe69e644ecd5ceb7a20eb3c76a9016"} Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.297150 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.305582 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.305788 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rdgpc" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="registry-server" containerID="cri-o://bd4a8e19339f53886f8e1f05d3792cb1bb29da3b9e4c6bc029a48012b0bfe269" gracePeriod=2 Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.471952 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.472427 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" podUID="28926f2e-f630-49fa-87f7-2c82067f06cc" containerName="controller-manager" containerID="cri-o://765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687" gracePeriod=30 Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.492155 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content\") pod \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.492310 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities\") pod \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.492373 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sxjck\" (UniqueName: \"kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck\") pod \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\" (UID: \"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0\") " Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.493452 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities" (OuterVolumeSpecName: "utilities") pod "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" (UID: "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.497926 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck" (OuterVolumeSpecName: "kube-api-access-sxjck") pod "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" (UID: "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0"). InnerVolumeSpecName "kube-api-access-sxjck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.516723 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" (UID: "2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.570753 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.570958 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" podUID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" containerName="route-controller-manager" containerID="cri-o://a5ad4469ff836c615e5b2bcb96b4fe9efd7c80eb9a37dbbbc54e3aa236361f04" gracePeriod=30 Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.593744 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.593780 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxjck\" (UniqueName: \"kubernetes.io/projected/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-kube-api-access-sxjck\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:41 crc kubenswrapper[4713]: I0308 00:12:41.593790 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.003871 4713 generic.go:334] "Generic (PLEG): container finished" podID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" containerID="a5ad4469ff836c615e5b2bcb96b4fe9efd7c80eb9a37dbbbc54e3aa236361f04" exitCode=0 Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.003926 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" 
event={"ID":"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528","Type":"ContainerDied","Data":"a5ad4469ff836c615e5b2bcb96b4fe9efd7c80eb9a37dbbbc54e3aa236361f04"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.004164 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" event={"ID":"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528","Type":"ContainerDied","Data":"8f642808b84e4f9a7dbfc1946365248a00698721e0ae378e73c5caef95a3edb5"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.004178 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f642808b84e4f9a7dbfc1946365248a00698721e0ae378e73c5caef95a3edb5" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.004562 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.007424 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hs88q" event={"ID":"2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0","Type":"ContainerDied","Data":"6fcd739b02f335d950276fc5d35bedd4422940f74a80db12ae1da2ebc8d7061a"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.007466 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hs88q" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.007490 4713 scope.go:117] "RemoveContainer" containerID="023ca4eb6026d184356661b957d297149cfe69e644ecd5ceb7a20eb3c76a9016" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.016736 4713 generic.go:334] "Generic (PLEG): container finished" podID="28926f2e-f630-49fa-87f7-2c82067f06cc" containerID="765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687" exitCode=0 Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.016873 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" event={"ID":"28926f2e-f630-49fa-87f7-2c82067f06cc","Type":"ContainerDied","Data":"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.016907 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" event={"ID":"28926f2e-f630-49fa-87f7-2c82067f06cc","Type":"ContainerDied","Data":"0667e551eeb85abb81e933da12494d8a43adb1cf8dd34c05e62b52f4f8685240"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.017011 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f498ddbb5-wj976" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.023331 4713 generic.go:334] "Generic (PLEG): container finished" podID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerID="bd4a8e19339f53886f8e1f05d3792cb1bb29da3b9e4c6bc029a48012b0bfe269" exitCode=0 Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.023395 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerDied","Data":"bd4a8e19339f53886f8e1f05d3792cb1bb29da3b9e4c6bc029a48012b0bfe269"} Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.023589 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.037537 4713 scope.go:117] "RemoveContainer" containerID="f5743c83cf849ed0707f05f9170f67beed9226bd36833eb3fea5238d2ff525b8" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.079009 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.081923 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hs88q"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.093184 4713 scope.go:117] "RemoveContainer" containerID="30fcbfe0635451c7fd3955c62a769f92ccede7936e36fa38580a85369fc7d85d" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.120501 4713 scope.go:117] "RemoveContainer" containerID="765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.133274 4713 scope.go:117] "RemoveContainer" containerID="765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687" Mar 08 00:12:42 crc kubenswrapper[4713]: 
E0308 00:12:42.134166 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687\": container with ID starting with 765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687 not found: ID does not exist" containerID="765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.134198 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687"} err="failed to get container status \"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687\": rpc error: code = NotFound desc = could not find container \"765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687\": container with ID starting with 765f68d8bc64d8c5a83f9e32f2b0ae7c66c88c6b731b6c17a50a000ff87ef687 not found: ID does not exist" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.195423 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200368 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config\") pod \"28926f2e-f630-49fa-87f7-2c82067f06cc\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200435 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert\") pod \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200453 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles\") pod \"28926f2e-f630-49fa-87f7-2c82067f06cc\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200473 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn4r8\" (UniqueName: \"kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8\") pod \"28926f2e-f630-49fa-87f7-2c82067f06cc\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200543 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca\") pod \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200566 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert\") pod \"28926f2e-f630-49fa-87f7-2c82067f06cc\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200595 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config\") pod \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200623 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnk2s\" (UniqueName: \"kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s\") pod \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\" (UID: \"5c2c4a52-cb5b-4da1-9c2b-1bb839c14528\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.200640 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca\") pod \"28926f2e-f630-49fa-87f7-2c82067f06cc\" (UID: \"28926f2e-f630-49fa-87f7-2c82067f06cc\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.201615 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "28926f2e-f630-49fa-87f7-2c82067f06cc" (UID: "28926f2e-f630-49fa-87f7-2c82067f06cc"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.201747 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config" (OuterVolumeSpecName: "config") pod "28926f2e-f630-49fa-87f7-2c82067f06cc" (UID: "28926f2e-f630-49fa-87f7-2c82067f06cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.201887 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config" (OuterVolumeSpecName: "config") pod "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" (UID: "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.201962 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "28926f2e-f630-49fa-87f7-2c82067f06cc" (UID: "28926f2e-f630-49fa-87f7-2c82067f06cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.202187 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca" (OuterVolumeSpecName: "client-ca") pod "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" (UID: "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.205850 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s" (OuterVolumeSpecName: "kube-api-access-wnk2s") pod "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" (UID: "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528"). InnerVolumeSpecName "kube-api-access-wnk2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.206103 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "28926f2e-f630-49fa-87f7-2c82067f06cc" (UID: "28926f2e-f630-49fa-87f7-2c82067f06cc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.206249 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" (UID: "5c2c4a52-cb5b-4da1-9c2b-1bb839c14528"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.206990 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8" (OuterVolumeSpecName: "kube-api-access-gn4r8") pod "28926f2e-f630-49fa-87f7-2c82067f06cc" (UID: "28926f2e-f630-49fa-87f7-2c82067f06cc"). InnerVolumeSpecName "kube-api-access-gn4r8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.302864 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmk7f\" (UniqueName: \"kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f\") pod \"dcde95f7-8814-4319-8a48-6d186de5f51f\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.302924 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content\") pod \"dcde95f7-8814-4319-8a48-6d186de5f51f\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303026 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities\") pod \"dcde95f7-8814-4319-8a48-6d186de5f51f\" (UID: \"dcde95f7-8814-4319-8a48-6d186de5f51f\") " Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303334 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303365 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28926f2e-f630-49fa-87f7-2c82067f06cc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303383 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303396 4713 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-wnk2s\" (UniqueName: \"kubernetes.io/projected/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-kube-api-access-wnk2s\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303409 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303420 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303431 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303443 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28926f2e-f630-49fa-87f7-2c82067f06cc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.303454 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn4r8\" (UniqueName: \"kubernetes.io/projected/28926f2e-f630-49fa-87f7-2c82067f06cc-kube-api-access-gn4r8\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.304003 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities" (OuterVolumeSpecName: "utilities") pod "dcde95f7-8814-4319-8a48-6d186de5f51f" (UID: "dcde95f7-8814-4319-8a48-6d186de5f51f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.306085 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f" (OuterVolumeSpecName: "kube-api-access-nmk7f") pod "dcde95f7-8814-4319-8a48-6d186de5f51f" (UID: "dcde95f7-8814-4319-8a48-6d186de5f51f"). InnerVolumeSpecName "kube-api-access-nmk7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.355239 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.358159 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f498ddbb5-wj976"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.404395 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.404426 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmk7f\" (UniqueName: \"kubernetes.io/projected/dcde95f7-8814-4319-8a48-6d186de5f51f-kube-api-access-nmk7f\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.430398 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcde95f7-8814-4319-8a48-6d186de5f51f" (UID: "dcde95f7-8814-4319-8a48-6d186de5f51f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.505769 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcde95f7-8814-4319-8a48-6d186de5f51f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.548331 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28926f2e-f630-49fa-87f7-2c82067f06cc" path="/var/lib/kubelet/pods/28926f2e-f630-49fa-87f7-2c82067f06cc/volumes" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.549118 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" path="/var/lib/kubelet/pods/2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0/volumes" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943148 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943425 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943440 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943453 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943462 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943476 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" 
containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943484 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943498 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" containerName="route-controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943508 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" containerName="route-controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943519 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943528 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943539 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943548 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943559 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943567 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943580 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943588 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943602 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28926f2e-f630-49fa-87f7-2c82067f06cc" containerName="controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943610 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="28926f2e-f630-49fa-87f7-2c82067f06cc" containerName="controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943624 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943634 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943647 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943655 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943666 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943675 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943685 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943693 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="extract-content" Mar 08 00:12:42 crc kubenswrapper[4713]: E0308 00:12:42.943703 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943711 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="extract-utilities" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943845 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c33b42a1-bf95-490f-a907-765855ec81d1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943859 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="28926f2e-f630-49fa-87f7-2c82067f06cc" containerName="controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943879 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" containerName="route-controller-manager" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943891 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd4a956b-6edb-436e-bd5e-5d57899c2ea1" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943901 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef0ec0c-d1f7-4ed1-81d8-fe12497c15b0" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.943913 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" containerName="registry-server" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.944361 4713 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.947571 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.949386 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.949703 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.950118 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.950331 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.952180 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.952273 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.953017 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.958254 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.961288 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:12:42 crc kubenswrapper[4713]: I0308 00:12:42.965026 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.035906 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rdgpc" event={"ID":"dcde95f7-8814-4319-8a48-6d186de5f51f","Type":"ContainerDied","Data":"ef8b074d9efbef9bd1985cd1c77849aac1a6142c1203709657b5b6f697605e4e"} Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.035952 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rdgpc" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.035970 4713 scope.go:117] "RemoveContainer" containerID="bd4a8e19339f53886f8e1f05d3792cb1bb29da3b9e4c6bc029a48012b0bfe269" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.038479 4713 generic.go:334] "Generic (PLEG): container finished" podID="2ab8d84d-9110-4bed-8288-4764d7c10f74" containerID="f9566defd908e4b2b14ead5994a9afb7bc984f75e3c8235a78747cca1c95babf" exitCode=0 Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.038544 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.038532 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29548800-ghv4d" event={"ID":"2ab8d84d-9110-4bed-8288-4764d7c10f74","Type":"ContainerDied","Data":"f9566defd908e4b2b14ead5994a9afb7bc984f75e3c8235a78747cca1c95babf"} Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.052142 4713 scope.go:117] "RemoveContainer" containerID="811a7fecc13f433a775d8c8b046af8802008222a2688bfa3140a6cccdba2f8bb" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.058738 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.063786 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rdgpc"] Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.079982 4713 scope.go:117] "RemoveContainer" containerID="eb31791b33621b563ffdcd2c2e41bd769a0b407d0d7cbd536956a89ac412d5bb" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.089449 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.095778 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cccf86c6-zhfs5"] Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.112868 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pnqj\" (UniqueName: \"kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 
00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.112923 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb5vf\" (UniqueName: \"kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.112970 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113103 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113148 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113173 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113233 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113309 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.113334 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214027 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc 
kubenswrapper[4713]: I0308 00:12:43.214085 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214110 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214137 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214187 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214210 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: 
\"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214241 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pnqj\" (UniqueName: \"kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214272 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb5vf\" (UniqueName: \"kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.214305 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.215359 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.215473 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.216433 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.217308 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.218540 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.218611 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 
00:12:43.234402 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pnqj\" (UniqueName: \"kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.243197 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb5vf\" (UniqueName: \"kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf\") pod \"route-controller-manager-86cddb879c-x9ppd\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.276639 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.294416 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config\") pod \"controller-manager-854bb687b5-6d9zw\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.569801 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:43 crc kubenswrapper[4713]: I0308 00:12:43.695495 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.010195 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.047874 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" event={"ID":"97ffa397-8c2d-4614-81c8-f0bd196db252","Type":"ContainerStarted","Data":"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4"} Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.048238 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.048254 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" event={"ID":"97ffa397-8c2d-4614-81c8-f0bd196db252","Type":"ContainerStarted","Data":"f98a407edc02834035ff48f1d7184aacc2041ec72127750f74d7bd1587b0b9d2"} Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.049148 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" event={"ID":"d80407f9-98a7-488a-aba0-f718da170a35","Type":"ContainerStarted","Data":"a828f2f69d7e7e4fb80afb9cc983c532df289af847177c4e0ec6d1fbe997c392"} Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.050148 4713 patch_prober.go:28] interesting pod/route-controller-manager-86cddb879c-x9ppd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" start-of-body= Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.050179 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.067647 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" podStartSLOduration=3.067624726 podStartE2EDuration="3.067624726s" podCreationTimestamp="2026-03-08 00:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:44.065748847 +0000 UTC m=+418.185381080" watchObservedRunningTime="2026-03-08 00:12:44.067624726 +0000 UTC m=+418.187256969" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.268270 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.430548 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtmqw\" (UniqueName: \"kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw\") pod \"2ab8d84d-9110-4bed-8288-4764d7c10f74\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.430602 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca\") pod \"2ab8d84d-9110-4bed-8288-4764d7c10f74\" (UID: \"2ab8d84d-9110-4bed-8288-4764d7c10f74\") " Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.431580 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca" (OuterVolumeSpecName: "serviceca") pod "2ab8d84d-9110-4bed-8288-4764d7c10f74" (UID: "2ab8d84d-9110-4bed-8288-4764d7c10f74"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.438405 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw" (OuterVolumeSpecName: "kube-api-access-rtmqw") pod "2ab8d84d-9110-4bed-8288-4764d7c10f74" (UID: "2ab8d84d-9110-4bed-8288-4764d7c10f74"). InnerVolumeSpecName "kube-api-access-rtmqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.531997 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtmqw\" (UniqueName: \"kubernetes.io/projected/2ab8d84d-9110-4bed-8288-4764d7c10f74-kube-api-access-rtmqw\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.532028 4713 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ab8d84d-9110-4bed-8288-4764d7c10f74-serviceca\") on node \"crc\" DevicePath \"\"" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.548266 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2c4a52-cb5b-4da1-9c2b-1bb839c14528" path="/var/lib/kubelet/pods/5c2c4a52-cb5b-4da1-9c2b-1bb839c14528/volumes" Mar 08 00:12:44 crc kubenswrapper[4713]: I0308 00:12:44.548724 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcde95f7-8814-4319-8a48-6d186de5f51f" path="/var/lib/kubelet/pods/dcde95f7-8814-4319-8a48-6d186de5f51f/volumes" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.060172 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29548800-ghv4d" event={"ID":"2ab8d84d-9110-4bed-8288-4764d7c10f74","Type":"ContainerDied","Data":"6fbb096291ab484496304a21d48e0c187a353974f802449b0a324f5c483976f8"} Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.060374 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fbb096291ab484496304a21d48e0c187a353974f802449b0a324f5c483976f8" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.060206 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29548800-ghv4d" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.063064 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" event={"ID":"d80407f9-98a7-488a-aba0-f718da170a35","Type":"ContainerStarted","Data":"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda"} Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.063460 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.069193 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.069655 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:12:45 crc kubenswrapper[4713]: I0308 00:12:45.095571 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" podStartSLOduration=4.095548315 podStartE2EDuration="4.095548315s" podCreationTimestamp="2026-03-08 00:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:12:45.087202977 +0000 UTC m=+419.206835230" watchObservedRunningTime="2026-03-08 00:12:45.095548315 +0000 UTC m=+419.215180558" Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.396805 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.397561 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" podUID="d80407f9-98a7-488a-aba0-f718da170a35" containerName="controller-manager" containerID="cri-o://1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda" gracePeriod=30 Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.410104 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.410585 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerName="route-controller-manager" containerID="cri-o://e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4" gracePeriod=30 Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.976469 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:13:01 crc kubenswrapper[4713]: I0308 00:13:01.993241 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.136946 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca\") pod \"97ffa397-8c2d-4614-81c8-f0bd196db252\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.136997 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert\") pod \"97ffa397-8c2d-4614-81c8-f0bd196db252\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.137031 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert\") pod \"d80407f9-98a7-488a-aba0-f718da170a35\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.137058 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca\") pod \"d80407f9-98a7-488a-aba0-f718da170a35\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.137843 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca" (OuterVolumeSpecName: "client-ca") pod "97ffa397-8c2d-4614-81c8-f0bd196db252" (UID: "97ffa397-8c2d-4614-81c8-f0bd196db252"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.137973 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca" (OuterVolumeSpecName: "client-ca") pod "d80407f9-98a7-488a-aba0-f718da170a35" (UID: "d80407f9-98a7-488a-aba0-f718da170a35"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138050 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb5vf\" (UniqueName: \"kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf\") pod \"97ffa397-8c2d-4614-81c8-f0bd196db252\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138100 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config\") pod \"d80407f9-98a7-488a-aba0-f718da170a35\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138117 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles\") pod \"d80407f9-98a7-488a-aba0-f718da170a35\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138139 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config\") pod \"97ffa397-8c2d-4614-81c8-f0bd196db252\" (UID: \"97ffa397-8c2d-4614-81c8-f0bd196db252\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138175 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6pnqj\" (UniqueName: \"kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj\") pod \"d80407f9-98a7-488a-aba0-f718da170a35\" (UID: \"d80407f9-98a7-488a-aba0-f718da170a35\") " Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138509 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config" (OuterVolumeSpecName: "config") pod "d80407f9-98a7-488a-aba0-f718da170a35" (UID: "d80407f9-98a7-488a-aba0-f718da170a35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138549 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d80407f9-98a7-488a-aba0-f718da170a35" (UID: "d80407f9-98a7-488a-aba0-f718da170a35"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.138874 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config" (OuterVolumeSpecName: "config") pod "97ffa397-8c2d-4614-81c8-f0bd196db252" (UID: "97ffa397-8c2d-4614-81c8-f0bd196db252"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.139077 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.139125 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.139138 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.139146 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ffa397-8c2d-4614-81c8-f0bd196db252-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.139155 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d80407f9-98a7-488a-aba0-f718da170a35-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.142277 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "97ffa397-8c2d-4614-81c8-f0bd196db252" (UID: "97ffa397-8c2d-4614-81c8-f0bd196db252"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.142328 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d80407f9-98a7-488a-aba0-f718da170a35" (UID: "d80407f9-98a7-488a-aba0-f718da170a35"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.142944 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf" (OuterVolumeSpecName: "kube-api-access-wb5vf") pod "97ffa397-8c2d-4614-81c8-f0bd196db252" (UID: "97ffa397-8c2d-4614-81c8-f0bd196db252"). InnerVolumeSpecName "kube-api-access-wb5vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.143800 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj" (OuterVolumeSpecName: "kube-api-access-6pnqj") pod "d80407f9-98a7-488a-aba0-f718da170a35" (UID: "d80407f9-98a7-488a-aba0-f718da170a35"). InnerVolumeSpecName "kube-api-access-6pnqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.155030 4713 generic.go:334] "Generic (PLEG): container finished" podID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerID="e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4" exitCode=0 Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.155100 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" event={"ID":"97ffa397-8c2d-4614-81c8-f0bd196db252","Type":"ContainerDied","Data":"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4"} Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.155132 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" event={"ID":"97ffa397-8c2d-4614-81c8-f0bd196db252","Type":"ContainerDied","Data":"f98a407edc02834035ff48f1d7184aacc2041ec72127750f74d7bd1587b0b9d2"} Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.155151 4713 scope.go:117] "RemoveContainer" containerID="e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.155268 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.159344 4713 generic.go:334] "Generic (PLEG): container finished" podID="d80407f9-98a7-488a-aba0-f718da170a35" containerID="1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda" exitCode=0 Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.159371 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.159396 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" event={"ID":"d80407f9-98a7-488a-aba0-f718da170a35","Type":"ContainerDied","Data":"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda"} Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.159430 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-854bb687b5-6d9zw" event={"ID":"d80407f9-98a7-488a-aba0-f718da170a35","Type":"ContainerDied","Data":"a828f2f69d7e7e4fb80afb9cc983c532df289af847177c4e0ec6d1fbe997c392"} Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.172737 4713 scope.go:117] "RemoveContainer" containerID="e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4" Mar 08 00:13:02 crc kubenswrapper[4713]: E0308 00:13:02.173448 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4\": container with ID starting with e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4 not found: ID does not exist" containerID="e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.173477 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4"} err="failed to get container status \"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4\": rpc error: code = NotFound desc = could not find container \"e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4\": container with ID starting with e77a05b39ea8975e4c9eb1dc5876f187e0cb360fa48f54f8bb6ea89f77ca58a4 not found: ID does 
not exist" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.173499 4713 scope.go:117] "RemoveContainer" containerID="1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.181325 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.184378 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86cddb879c-x9ppd"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.190781 4713 scope.go:117] "RemoveContainer" containerID="1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda" Mar 08 00:13:02 crc kubenswrapper[4713]: E0308 00:13:02.191233 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda\": container with ID starting with 1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda not found: ID does not exist" containerID="1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.191268 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda"} err="failed to get container status \"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda\": rpc error: code = NotFound desc = could not find container \"1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda\": container with ID starting with 1de3ab5a5cf66f375f3d44be2148831ab8737f85f3f740b34100021c82990dda not found: ID does not exist" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.192425 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.197049 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-854bb687b5-6d9zw"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.240871 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb5vf\" (UniqueName: \"kubernetes.io/projected/97ffa397-8c2d-4614-81c8-f0bd196db252-kube-api-access-wb5vf\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.240910 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pnqj\" (UniqueName: \"kubernetes.io/projected/d80407f9-98a7-488a-aba0-f718da170a35-kube-api-access-6pnqj\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.240923 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffa397-8c2d-4614-81c8-f0bd196db252-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.240934 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80407f9-98a7-488a-aba0-f718da170a35-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.548184 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" path="/var/lib/kubelet/pods/97ffa397-8c2d-4614-81c8-f0bd196db252/volumes" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.548688 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d80407f9-98a7-488a-aba0-f718da170a35" path="/var/lib/kubelet/pods/d80407f9-98a7-488a-aba0-f718da170a35/volumes" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.956352 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 00:13:02 crc kubenswrapper[4713]: E0308 00:13:02.956900 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab8d84d-9110-4bed-8288-4764d7c10f74" containerName="image-pruner" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.956916 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab8d84d-9110-4bed-8288-4764d7c10f74" containerName="image-pruner" Mar 08 00:13:02 crc kubenswrapper[4713]: E0308 00:13:02.956931 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerName="route-controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.956939 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerName="route-controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: E0308 00:13:02.956950 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d80407f9-98a7-488a-aba0-f718da170a35" containerName="controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.956958 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d80407f9-98a7-488a-aba0-f718da170a35" containerName="controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.957053 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab8d84d-9110-4bed-8288-4764d7c10f74" containerName="image-pruner" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.957064 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ffa397-8c2d-4614-81c8-f0bd196db252" containerName="route-controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.957071 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d80407f9-98a7-488a-aba0-f718da170a35" containerName="controller-manager" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.957409 4713 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.960240 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-565fb68b56-2gcqx"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.960733 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.962027 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.962036 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.962303 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.962386 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.962408 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.964101 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.964297 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.964879 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.965075 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.965290 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.971963 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.973582 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.973668 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.976105 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 00:13:02 crc kubenswrapper[4713]: I0308 00:13:02.981118 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-565fb68b56-2gcqx"] Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150622 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxwhh\" (UniqueName: \"kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150675 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-client-ca\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150715 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-proxy-ca-bundles\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150780 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150805 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-config\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150882 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: 
\"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.150968 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef732ea-c325-44b3-9624-63ea4f20e3c5-serving-cert\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.151005 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.151036 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnjrn\" (UniqueName: \"kubernetes.io/projected/6ef732ea-c325-44b3-9624-63ea4f20e3c5-kube-api-access-hnjrn\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252065 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-proxy-ca-bundles\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252132 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252158 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-config\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252183 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252211 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef732ea-c325-44b3-9624-63ea4f20e3c5-serving-cert\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252236 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " 
pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252265 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnjrn\" (UniqueName: \"kubernetes.io/projected/6ef732ea-c325-44b3-9624-63ea4f20e3c5-kube-api-access-hnjrn\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252314 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxwhh\" (UniqueName: \"kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.252340 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-client-ca\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.253474 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-client-ca\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.253893 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.253898 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-proxy-ca-bundles\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.254320 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.254424 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef732ea-c325-44b3-9624-63ea4f20e3c5-config\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.256810 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.257312 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef732ea-c325-44b3-9624-63ea4f20e3c5-serving-cert\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.270213 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxwhh\" (UniqueName: \"kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh\") pod \"route-controller-manager-b4cc9495d-jlqd7\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.275750 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnjrn\" (UniqueName: \"kubernetes.io/projected/6ef732ea-c325-44b3-9624-63ea4f20e3c5-kube-api-access-hnjrn\") pod \"controller-manager-565fb68b56-2gcqx\" (UID: \"6ef732ea-c325-44b3-9624-63ea4f20e3c5\") " pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.279958 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.288753 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.671655 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-565fb68b56-2gcqx"] Mar 08 00:13:03 crc kubenswrapper[4713]: I0308 00:13:03.778482 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 00:13:03 crc kubenswrapper[4713]: W0308 00:13:03.781019 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf42dbb0_d1f1_44a1_8f0f_f26bcae1ec2f.slice/crio-625788b3abcc99d5de48b9c586df8a5a324171e19f6fe5959a7369dde47d6f2c WatchSource:0}: Error finding container 625788b3abcc99d5de48b9c586df8a5a324171e19f6fe5959a7369dde47d6f2c: Status 404 returned error can't find the container with id 625788b3abcc99d5de48b9c586df8a5a324171e19f6fe5959a7369dde47d6f2c Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.172260 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" event={"ID":"6ef732ea-c325-44b3-9624-63ea4f20e3c5","Type":"ContainerStarted","Data":"12dcbf3a5435ed5281f646f5d6ca495ee6e9e4efd37433b82af66cc6b99c1ca7"} Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.172642 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.172661 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" event={"ID":"6ef732ea-c325-44b3-9624-63ea4f20e3c5","Type":"ContainerStarted","Data":"7d6af5756e571ffd3e794b0bf99d5433d1152e4315fafa69d81b14c70429744d"} Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.173703 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" event={"ID":"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f","Type":"ContainerStarted","Data":"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c"} Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.173758 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" event={"ID":"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f","Type":"ContainerStarted","Data":"625788b3abcc99d5de48b9c586df8a5a324171e19f6fe5959a7369dde47d6f2c"} Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.174210 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.178403 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.180751 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.187581 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-565fb68b56-2gcqx" podStartSLOduration=3.187561811 podStartE2EDuration="3.187561811s" podCreationTimestamp="2026-03-08 00:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:13:04.186035011 +0000 UTC m=+438.305667264" watchObservedRunningTime="2026-03-08 00:13:04.187561811 +0000 UTC m=+438.307194044" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.257610 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" podStartSLOduration=3.2575753069999998 podStartE2EDuration="3.257575307s" podCreationTimestamp="2026-03-08 00:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:13:04.254553428 +0000 UTC m=+438.374185681" watchObservedRunningTime="2026-03-08 00:13:04.257575307 +0000 UTC m=+438.377207540" Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.500849 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:13:04 crc kubenswrapper[4713]: I0308 00:13:04.500923 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.392531 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.393329 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" podUID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" containerName="route-controller-manager" containerID="cri-o://b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c" gracePeriod=30 Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.876778 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.946174 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca\") pod \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.946305 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config\") pod \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.946616 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert\") pod \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.946669 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxwhh\" (UniqueName: \"kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh\") pod \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\" (UID: \"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f\") " Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.947176 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca" (OuterVolumeSpecName: "client-ca") pod "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" (UID: "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.947236 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config" (OuterVolumeSpecName: "config") pod "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" (UID: "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.952702 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" (UID: "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:13:21 crc kubenswrapper[4713]: I0308 00:13:21.953429 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh" (OuterVolumeSpecName: "kube-api-access-jxwhh") pod "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" (UID: "bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f"). InnerVolumeSpecName "kube-api-access-jxwhh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.048033 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.048062 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.048076 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxwhh\" (UniqueName: \"kubernetes.io/projected/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-kube-api-access-jxwhh\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.048087 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.268461 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" containerID="b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c" exitCode=0 Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.268529 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" event={"ID":"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f","Type":"ContainerDied","Data":"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c"} Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.268587 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" 
event={"ID":"bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f","Type":"ContainerDied","Data":"625788b3abcc99d5de48b9c586df8a5a324171e19f6fe5959a7369dde47d6f2c"} Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.268586 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.268656 4713 scope.go:117] "RemoveContainer" containerID="b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.284783 4713 scope.go:117] "RemoveContainer" containerID="b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c" Mar 08 00:13:22 crc kubenswrapper[4713]: E0308 00:13:22.285274 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c\": container with ID starting with b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c not found: ID does not exist" containerID="b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.285317 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c"} err="failed to get container status \"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c\": rpc error: code = NotFound desc = could not find container \"b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c\": container with ID starting with b3abd0c083801939f433ca544c64b04f93c4cb7413cde9fdfa35b6d07230fe7c not found: ID does not exist" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.308954 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 
00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.312228 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b4cc9495d-jlqd7"] Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.549177 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" path="/var/lib/kubelet/pods/bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f/volumes" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.967517 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq"] Mar 08 00:13:22 crc kubenswrapper[4713]: E0308 00:13:22.968176 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" containerName="route-controller-manager" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.968256 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" containerName="route-controller-manager" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.968406 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf42dbb0-d1f1-44a1-8f0f-f26bcae1ec2f" containerName="route-controller-manager" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.968797 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.971215 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.971588 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.972013 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.972544 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.972924 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.973228 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 00:13:22 crc kubenswrapper[4713]: I0308 00:13:22.979860 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq"] Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.063114 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/741fdcbc-fc9d-499a-958e-0e605cb9a874-serving-cert\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.063165 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-client-ca\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.063198 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-config\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.063232 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw2gr\" (UniqueName: \"kubernetes.io/projected/741fdcbc-fc9d-499a-958e-0e605cb9a874-kube-api-access-xw2gr\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.163916 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-config\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.163989 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw2gr\" (UniqueName: \"kubernetes.io/projected/741fdcbc-fc9d-499a-958e-0e605cb9a874-kube-api-access-xw2gr\") pod 
\"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.164054 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/741fdcbc-fc9d-499a-958e-0e605cb9a874-serving-cert\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.164086 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-client-ca\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.165044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-client-ca\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.165376 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/741fdcbc-fc9d-499a-958e-0e605cb9a874-config\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.167921 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/741fdcbc-fc9d-499a-958e-0e605cb9a874-serving-cert\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.184633 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw2gr\" (UniqueName: \"kubernetes.io/projected/741fdcbc-fc9d-499a-958e-0e605cb9a874-kube-api-access-xw2gr\") pod \"route-controller-manager-86b9c7bbc4-q2clq\" (UID: \"741fdcbc-fc9d-499a-958e-0e605cb9a874\") " pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.286630 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:23 crc kubenswrapper[4713]: I0308 00:13:23.674631 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq"] Mar 08 00:13:24 crc kubenswrapper[4713]: I0308 00:13:24.279294 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" event={"ID":"741fdcbc-fc9d-499a-958e-0e605cb9a874","Type":"ContainerStarted","Data":"dece7a0e6f49a84784dc86cc91c70b8267cc1d04fbfe058516aeb00cf435ee85"} Mar 08 00:13:24 crc kubenswrapper[4713]: I0308 00:13:24.279612 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" event={"ID":"741fdcbc-fc9d-499a-958e-0e605cb9a874","Type":"ContainerStarted","Data":"73ffdca953cd89eabe24a66d50ad9de20666a72933814c8c2e19bd8e6aa00922"} Mar 08 00:13:24 crc kubenswrapper[4713]: I0308 00:13:24.279628 4713 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:24 crc kubenswrapper[4713]: I0308 00:13:24.285745 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" Mar 08 00:13:24 crc kubenswrapper[4713]: I0308 00:13:24.298413 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86b9c7bbc4-q2clq" podStartSLOduration=3.298392686 podStartE2EDuration="3.298392686s" podCreationTimestamp="2026-03-08 00:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:13:24.294490624 +0000 UTC m=+458.414122867" watchObservedRunningTime="2026-03-08 00:13:24.298392686 +0000 UTC m=+458.418024929" Mar 08 00:13:34 crc kubenswrapper[4713]: I0308 00:13:34.501030 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:13:34 crc kubenswrapper[4713]: I0308 00:13:34.501979 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.331792 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.333435 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-x6gcb" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="registry-server" containerID="cri-o://99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6" gracePeriod=30 Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.347847 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4tj99"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.348355 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4tj99" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="registry-server" containerID="cri-o://e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6" gracePeriod=30 Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.355902 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.356174 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" podUID="9e570b68-8b4c-42e3-839d-f37943999246" containerName="marketplace-operator" containerID="cri-o://fd9a48944f15c013216b1e59cc31e3539b1ac73b38b0051a0a81749066e50d41" gracePeriod=30 Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.370025 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.370927 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5hssk" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="registry-server" containerID="cri-o://4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817" gracePeriod=30 Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.375126 4713 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-57pjt"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.375444 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-57pjt" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="registry-server" containerID="cri-o://4ed848ed6abb07f4a89c3ace3ce761bce0134ceff6e51ed39e7ca6d27a1477c1" gracePeriod=30 Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.379307 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bm59"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.380268 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.391220 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bm59"] Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.485025 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.485116 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.485157 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhqqr\" (UniqueName: \"kubernetes.io/projected/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-kube-api-access-fhqqr\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.585694 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhqqr\" (UniqueName: \"kubernetes.io/projected/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-kube-api-access-fhqqr\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.585740 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.585816 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.587295 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bm59\" 
(UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.592805 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.602860 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhqqr\" (UniqueName: \"kubernetes.io/projected/26e0cfc6-458c-4be3-b57c-1cd5fad657c4-kube-api-access-fhqqr\") pod \"marketplace-operator-79b997595-4bm59\" (UID: \"26e0cfc6-458c-4be3-b57c-1cd5fad657c4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.710217 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" Mar 08 00:13:50 crc kubenswrapper[4713]: E0308 00:13:50.846607 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6 is running failed: container process not found" containerID="99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 00:13:50 crc kubenswrapper[4713]: E0308 00:13:50.847389 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6 is running failed: container process not found" containerID="99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 00:13:50 crc kubenswrapper[4713]: E0308 00:13:50.847653 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6 is running failed: container process not found" containerID="99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 00:13:50 crc kubenswrapper[4713]: E0308 00:13:50.847743 4713 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-x6gcb" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="registry-server" Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.933528 4713 patch_prober.go:28] 
interesting pod/marketplace-operator-79b997595-p9hqz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 08 00:13:50 crc kubenswrapper[4713]: I0308 00:13:50.933591 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" podUID="9e570b68-8b4c-42e3-839d-f37943999246" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.042789 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6 is running failed: container process not found" containerID="e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.043276 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6 is running failed: container process not found" containerID="e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.043614 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6 is running failed: container process not found" containerID="e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6" 
cmd=["grpc_health_probe","-addr=:50051"] Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.043676 4713 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-4tj99" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="registry-server" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.114547 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bm59"] Mar 08 00:13:51 crc kubenswrapper[4713]: W0308 00:13:51.128233 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e0cfc6_458c_4be3_b57c_1cd5fad657c4.slice/crio-4a98aca99092786cfe5fa97a753e75d75ea88d114f04bef2cdee1d3307f8e478 WatchSource:0}: Error finding container 4a98aca99092786cfe5fa97a753e75d75ea88d114f04bef2cdee1d3307f8e478: Status 404 returned error can't find the container with id 4a98aca99092786cfe5fa97a753e75d75ea88d114f04bef2cdee1d3307f8e478 Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.367566 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.447240 4713 generic.go:334] "Generic (PLEG): container finished" podID="40864d72-e137-478e-8340-8c0f107b4c60" containerID="e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6" exitCode=0 Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.447327 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerDied","Data":"e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6"} Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.450029 4713 generic.go:334] "Generic (PLEG): container finished" podID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerID="99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6" exitCode=0 Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.450080 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerDied","Data":"99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6"} Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.450880 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" event={"ID":"26e0cfc6-458c-4be3-b57c-1cd5fad657c4","Type":"ContainerStarted","Data":"4a98aca99092786cfe5fa97a753e75d75ea88d114f04bef2cdee1d3307f8e478"} Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.452621 4713 generic.go:334] "Generic (PLEG): container finished" podID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerID="4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817" exitCode=0 Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.452694 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" 
event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerDied","Data":"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817"} Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.452712 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hssk" event={"ID":"822fdb72-7e7f-441b-8ebc-178ef46cca73","Type":"ContainerDied","Data":"fcc1f03f798c9a1497a249637518dbb0a71923b3eba6d35aa4080c621862fa0f"} Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.452754 4713 scope.go:117] "RemoveContainer" containerID="4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.452954 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hssk" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.461923 4713 generic.go:334] "Generic (PLEG): container finished" podID="9e570b68-8b4c-42e3-839d-f37943999246" containerID="fd9a48944f15c013216b1e59cc31e3539b1ac73b38b0051a0a81749066e50d41" exitCode=0 Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.461999 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" event={"ID":"9e570b68-8b4c-42e3-839d-f37943999246","Type":"ContainerDied","Data":"fd9a48944f15c013216b1e59cc31e3539b1ac73b38b0051a0a81749066e50d41"} Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.469781 4713 generic.go:334] "Generic (PLEG): container finished" podID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerID="4ed848ed6abb07f4a89c3ace3ce761bce0134ceff6e51ed39e7ca6d27a1477c1" exitCode=0 Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.469826 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" 
event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerDied","Data":"4ed848ed6abb07f4a89c3ace3ce761bce0134ceff6e51ed39e7ca6d27a1477c1"} Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.480480 4713 scope.go:117] "RemoveContainer" containerID="524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.498377 4713 scope.go:117] "RemoveContainer" containerID="fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.499480 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsx97\" (UniqueName: \"kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97\") pod \"822fdb72-7e7f-441b-8ebc-178ef46cca73\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.499513 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities\") pod \"822fdb72-7e7f-441b-8ebc-178ef46cca73\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.499641 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content\") pod \"822fdb72-7e7f-441b-8ebc-178ef46cca73\" (UID: \"822fdb72-7e7f-441b-8ebc-178ef46cca73\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.500630 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities" (OuterVolumeSpecName: "utilities") pod "822fdb72-7e7f-441b-8ebc-178ef46cca73" (UID: "822fdb72-7e7f-441b-8ebc-178ef46cca73"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.516447 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97" (OuterVolumeSpecName: "kube-api-access-bsx97") pod "822fdb72-7e7f-441b-8ebc-178ef46cca73" (UID: "822fdb72-7e7f-441b-8ebc-178ef46cca73"). InnerVolumeSpecName "kube-api-access-bsx97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.534970 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "822fdb72-7e7f-441b-8ebc-178ef46cca73" (UID: "822fdb72-7e7f-441b-8ebc-178ef46cca73"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.577550 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6gcb" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.581283 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.582055 4713 scope.go:117] "RemoveContainer" containerID="4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817" Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.582303 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817\": container with ID starting with 4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817 not found: ID does not exist" containerID="4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.582330 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817"} err="failed to get container status \"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817\": rpc error: code = NotFound desc = could not find container \"4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817\": container with ID starting with 4cfc44af3acab9f9da37265b5df0c44c4ce8481c6b73a6a1c6911e1394713817 not found: ID does not exist" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.582348 4713 scope.go:117] "RemoveContainer" containerID="524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133" Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.582573 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133\": container with ID starting with 524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133 not found: ID does not exist" containerID="524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133" Mar 08 00:13:51 crc 
kubenswrapper[4713]: I0308 00:13:51.582599 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133"} err="failed to get container status \"524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133\": rpc error: code = NotFound desc = could not find container \"524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133\": container with ID starting with 524dfa3729d8726beb09ae412f7321389ba47ef0624fa7d2798a1f20145b2133 not found: ID does not exist" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.582614 4713 scope.go:117] "RemoveContainer" containerID="fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1" Mar 08 00:13:51 crc kubenswrapper[4713]: E0308 00:13:51.582853 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1\": container with ID starting with fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1 not found: ID does not exist" containerID="fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.582875 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1"} err="failed to get container status \"fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1\": rpc error: code = NotFound desc = could not find container \"fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1\": container with ID starting with fa81935375891e84987b059dfdea9629b743e60a7365748b113fb9a50d109ab1 not found: ID does not exist" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.586585 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4tj99" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.596643 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-57pjt" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.601334 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsx97\" (UniqueName: \"kubernetes.io/projected/822fdb72-7e7f-441b-8ebc-178ef46cca73-kube-api-access-bsx97\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.601360 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.601368 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822fdb72-7e7f-441b-8ebc-178ef46cca73-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.701840 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") pod \"9e570b68-8b4c-42e3-839d-f37943999246\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.701914 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content\") pod \"d9341928-7a63-4190-ac37-ac9ba3320e18\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.701969 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8fx2\" 
(UniqueName: \"kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2\") pod \"40864d72-e137-478e-8340-8c0f107b4c60\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.701985 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") pod \"9e570b68-8b4c-42e3-839d-f37943999246\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.701999 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content\") pod \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities\") pod \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702070 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-795x2\" (UniqueName: \"kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2\") pod \"9e570b68-8b4c-42e3-839d-f37943999246\" (UID: \"9e570b68-8b4c-42e3-839d-f37943999246\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702091 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities\") pod \"d9341928-7a63-4190-ac37-ac9ba3320e18\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " Mar 08 00:13:51 crc 
kubenswrapper[4713]: I0308 00:13:51.702110 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content\") pod \"40864d72-e137-478e-8340-8c0f107b4c60\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702129 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfdss\" (UniqueName: \"kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss\") pod \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\" (UID: \"e23a30a2-2bf8-451e-b85b-b293e8949e9e\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702152 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prrdn\" (UniqueName: \"kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn\") pod \"d9341928-7a63-4190-ac37-ac9ba3320e18\" (UID: \"d9341928-7a63-4190-ac37-ac9ba3320e18\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.702182 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities\") pod \"40864d72-e137-478e-8340-8c0f107b4c60\" (UID: \"40864d72-e137-478e-8340-8c0f107b4c60\") " Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.703079 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities" (OuterVolumeSpecName: "utilities") pod "e23a30a2-2bf8-451e-b85b-b293e8949e9e" (UID: "e23a30a2-2bf8-451e-b85b-b293e8949e9e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.703163 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities" (OuterVolumeSpecName: "utilities") pod "40864d72-e137-478e-8340-8c0f107b4c60" (UID: "40864d72-e137-478e-8340-8c0f107b4c60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.704693 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities" (OuterVolumeSpecName: "utilities") pod "d9341928-7a63-4190-ac37-ac9ba3320e18" (UID: "d9341928-7a63-4190-ac37-ac9ba3320e18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.705061 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "9e570b68-8b4c-42e3-839d-f37943999246" (UID: "9e570b68-8b4c-42e3-839d-f37943999246"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.705268 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2" (OuterVolumeSpecName: "kube-api-access-m8fx2") pod "40864d72-e137-478e-8340-8c0f107b4c60" (UID: "40864d72-e137-478e-8340-8c0f107b4c60"). InnerVolumeSpecName "kube-api-access-m8fx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.705335 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2" (OuterVolumeSpecName: "kube-api-access-795x2") pod "9e570b68-8b4c-42e3-839d-f37943999246" (UID: "9e570b68-8b4c-42e3-839d-f37943999246"). InnerVolumeSpecName "kube-api-access-795x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.707643 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "9e570b68-8b4c-42e3-839d-f37943999246" (UID: "9e570b68-8b4c-42e3-839d-f37943999246"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.712043 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss" (OuterVolumeSpecName: "kube-api-access-kfdss") pod "e23a30a2-2bf8-451e-b85b-b293e8949e9e" (UID: "e23a30a2-2bf8-451e-b85b-b293e8949e9e"). InnerVolumeSpecName "kube-api-access-kfdss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.712161 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn" (OuterVolumeSpecName: "kube-api-access-prrdn") pod "d9341928-7a63-4190-ac37-ac9ba3320e18" (UID: "d9341928-7a63-4190-ac37-ac9ba3320e18"). InnerVolumeSpecName "kube-api-access-prrdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.772673 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40864d72-e137-478e-8340-8c0f107b4c60" (UID: "40864d72-e137-478e-8340-8c0f107b4c60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.785058 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"] Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.785803 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9341928-7a63-4190-ac37-ac9ba3320e18" (UID: "d9341928-7a63-4190-ac37-ac9ba3320e18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.788555 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hssk"] Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804101 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-795x2\" (UniqueName: \"kubernetes.io/projected/9e570b68-8b4c-42e3-839d-f37943999246-kube-api-access-795x2\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804138 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804151 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804165 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfdss\" (UniqueName: \"kubernetes.io/projected/e23a30a2-2bf8-451e-b85b-b293e8949e9e-kube-api-access-kfdss\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804176 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prrdn\" (UniqueName: \"kubernetes.io/projected/d9341928-7a63-4190-ac37-ac9ba3320e18-kube-api-access-prrdn\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804187 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40864d72-e137-478e-8340-8c0f107b4c60-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804197 4713 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804209 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9341928-7a63-4190-ac37-ac9ba3320e18-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804219 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8fx2\" (UniqueName: \"kubernetes.io/projected/40864d72-e137-478e-8340-8c0f107b4c60-kube-api-access-m8fx2\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804229 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e570b68-8b4c-42e3-839d-f37943999246-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.804239 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.834277 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e23a30a2-2bf8-451e-b85b-b293e8949e9e" (UID: "e23a30a2-2bf8-451e-b85b-b293e8949e9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:13:51 crc kubenswrapper[4713]: I0308 00:13:51.905946 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23a30a2-2bf8-451e-b85b-b293e8949e9e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.032759 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.477018 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz" event={"ID":"9e570b68-8b4c-42e3-839d-f37943999246","Type":"ContainerDied","Data":"8a2d896d73aedf449a67c5c1becd624d05fd0cc1bac64192c1528302ec9e1810"}
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.477529 4713 scope.go:117] "RemoveContainer" containerID="fd9a48944f15c013216b1e59cc31e3539b1ac73b38b0051a0a81749066e50d41"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.477089 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p9hqz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.488676 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57pjt" event={"ID":"e23a30a2-2bf8-451e-b85b-b293e8949e9e","Type":"ContainerDied","Data":"7c30588800e0dac5ab38807a23f6184382c53099e569400f6073fb7739048d46"}
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.488886 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-57pjt"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.493544 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4tj99" event={"ID":"40864d72-e137-478e-8340-8c0f107b4c60","Type":"ContainerDied","Data":"3cdea3678803ad7453d0a386b7a4a0468a866e4a3767422ad83b05a97ef4bf14"}
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.493672 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4tj99"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.495859 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6gcb" event={"ID":"d9341928-7a63-4190-ac37-ac9ba3320e18","Type":"ContainerDied","Data":"8da0f0760030352f0e71a9d8d27a1069de63fe3b39a327ba9c1b618d352e4f81"}
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.496062 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6gcb"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.499377 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" event={"ID":"26e0cfc6-458c-4be3-b57c-1cd5fad657c4","Type":"ContainerStarted","Data":"3b2176370935e6a2e1310e78999dfe2021e4e97c1e8a1c47e184b64c068dff71"}
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.499991 4713 scope.go:117] "RemoveContainer" containerID="4ed848ed6abb07f4a89c3ace3ce761bce0134ceff6e51ed39e7ca6d27a1477c1"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.509596 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.518553 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p9hqz"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.525920 4713 scope.go:117] "RemoveContainer" containerID="71df55d2c41e29b364984f11829b378396c7e97525399c55ef7102e7db5b6a0a"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.532463 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-57pjt"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.536539 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-57pjt"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.548896 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" path="/var/lib/kubelet/pods/822fdb72-7e7f-441b-8ebc-178ef46cca73/volumes"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.549787 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e570b68-8b4c-42e3-839d-f37943999246" path="/var/lib/kubelet/pods/9e570b68-8b4c-42e3-839d-f37943999246/volumes"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.550393 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" path="/var/lib/kubelet/pods/e23a30a2-2bf8-451e-b85b-b293e8949e9e/volumes"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.555676 4713 scope.go:117] "RemoveContainer" containerID="99ba221bc55466be0084d80442d6dec86c90deadbc054c19ec89fd1d01900208"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558435 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4m4tz"]
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558706 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558737 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558748 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="extract-utilities"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558756 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="extract-utilities"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558767 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e570b68-8b4c-42e3-839d-f37943999246" containerName="marketplace-operator"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558775 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e570b68-8b4c-42e3-839d-f37943999246" containerName="marketplace-operator"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558795 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="extract-content"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558803 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="extract-content"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558814 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="extract-content"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558840 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="extract-content"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558853 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="extract-content"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558860 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="extract-content"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558870 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="extract-utilities"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558879 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="extract-utilities"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558890 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="extract-utilities"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558897 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="extract-utilities"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558906 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558913 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558923 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="extract-utilities"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558931 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="extract-utilities"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558940 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558947 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558958 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558968 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: E0308 00:13:52.558981 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="extract-content"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.558989 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="extract-content"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.559099 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.559111 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e570b68-8b4c-42e3-839d-f37943999246" containerName="marketplace-operator"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.559124 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="40864d72-e137-478e-8340-8c0f107b4c60" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.559132 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e23a30a2-2bf8-451e-b85b-b293e8949e9e" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.559140 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="822fdb72-7e7f-441b-8ebc-178ef46cca73" containerName="registry-server"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.559790 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.561279 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.562900 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59" podStartSLOduration=2.562874129 podStartE2EDuration="2.562874129s" podCreationTimestamp="2026-03-08 00:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:13:52.555388493 +0000 UTC m=+486.675020726" watchObservedRunningTime="2026-03-08 00:13:52.562874129 +0000 UTC m=+486.682506372"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.582681 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4m4tz"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.591299 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.594249 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x6gcb"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.597955 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4tj99"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.603562 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4tj99"]
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.608505 4713 scope.go:117] "RemoveContainer" containerID="e4df11f30a00eeb8975bf590dfcc99035d1dbd89952445cfb19e1aa26d7407f6"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.615803 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.615894 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrdd5\" (UniqueName: \"kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.615943 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.624693 4713 scope.go:117] "RemoveContainer" containerID="46ee2fecb258f3bbeadd642b9e3423768d2062de8a5dd3a187b3ace78fd14497"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.640440 4713 scope.go:117] "RemoveContainer" containerID="b521ece8028ebf9207946445f9aecae87b7e5c6d252fd707c34dc0276256c2c0"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.654483 4713 scope.go:117] "RemoveContainer" containerID="99dd020645e7b6695acb2f758f9b98023643a329f5c7e44db6eec7c1278babd6"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.671137 4713 scope.go:117] "RemoveContainer" containerID="c0124cd1b5219c688a51426a00c55773b87427b1a16957ad745e3fd3a1ca06b1"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.685977 4713 scope.go:117] "RemoveContainer" containerID="e4404a3c0caa01e5acd1c3db2a69f4b96b4d1f768431d32a330b55a8351235db"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.717547 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.717624 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrdd5\" (UniqueName: \"kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.717660 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.718117 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.718179 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.736184 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrdd5\" (UniqueName: \"kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5\") pod \"redhat-marketplace-4m4tz\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:13:52 crc kubenswrapper[4713]: I0308 00:13:52.903189 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4m4tz"
Mar 08 00:13:53 crc kubenswrapper[4713]: I0308 00:13:53.285286 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4m4tz"]
Mar 08 00:13:53 crc kubenswrapper[4713]: I0308 00:13:53.507318 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerStarted","Data":"872b442fcf53dc350c20c113c6415793cd135f6045c9203dc5387eb2fa9f45e6"}
Mar 08 00:13:53 crc kubenswrapper[4713]: I0308 00:13:53.511108 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59"
Mar 08 00:13:53 crc kubenswrapper[4713]: I0308 00:13:53.513483 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4bm59"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.517422 4713 generic.go:334] "Generic (PLEG): container finished" podID="cb44436e-472b-4a5f-8ff6-06242535e835" containerID="b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6" exitCode=0
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.517520 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerDied","Data":"b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6"}
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.555149 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40864d72-e137-478e-8340-8c0f107b4c60" path="/var/lib/kubelet/pods/40864d72-e137-478e-8340-8c0f107b4c60/volumes"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.555760 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9341928-7a63-4190-ac37-ac9ba3320e18" path="/var/lib/kubelet/pods/d9341928-7a63-4190-ac37-ac9ba3320e18/volumes"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.747887 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rc7p9"]
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.749072 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rc7p9"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.751223 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.752385 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rc7p9"]
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.847409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-utilities\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.847511 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mz5n\" (UniqueName: \"kubernetes.io/projected/dd52d225-2e7e-4958-98fc-52028b545353-kube-api-access-6mz5n\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.847659 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-catalog-content\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.946651 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mn4rt"]
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.949076 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-utilities\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.949127 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mz5n\" (UniqueName: \"kubernetes.io/projected/dd52d225-2e7e-4958-98fc-52028b545353-kube-api-access-6mz5n\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.949168 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-catalog-content\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.949674 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-catalog-content\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.949951 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd52d225-2e7e-4958-98fc-52028b545353-utilities\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.950199 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mn4rt"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.952638 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.958561 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mn4rt"]
Mar 08 00:13:54 crc kubenswrapper[4713]: I0308 00:13:54.978118 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mz5n\" (UniqueName: \"kubernetes.io/projected/dd52d225-2e7e-4958-98fc-52028b545353-kube-api-access-6mz5n\") pod \"community-operators-rc7p9\" (UID: \"dd52d225-2e7e-4958-98fc-52028b545353\") " pod="openshift-marketplace/community-operators-rc7p9"
Mar 08 00:13:55 crc kubenswrapper[4713]: I0308 00:13:55.050153 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb2jx\" (UniqueName: \"kubernetes.io/projected/ce49dca5-e07d-416e-a72d-281928ff343b-kube-api-access-fb2jx\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt"
Mar 08 00:13:55 crc kubenswrapper[4713]: I0308 00:13:55.050226 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-utilities\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt"
Mar 08 00:13:55 crc kubenswrapper[4713]: I0308 00:13:55.050281 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-catalog-content\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt"
Mar 08 00:13:55 crc kubenswrapper[4713]: I0308 00:13:55.067576 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rc7p9"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.152013 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb2jx\" (UniqueName: \"kubernetes.io/projected/ce49dca5-e07d-416e-a72d-281928ff343b-kube-api-access-fb2jx\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.152408 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-utilities\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.152477 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-catalog-content\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.153040 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-utilities\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.153162 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce49dca5-e07d-416e-a72d-281928ff343b-catalog-content\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.168689 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb2jx\" (UniqueName: \"kubernetes.io/projected/ce49dca5-e07d-416e-a72d-281928ff343b-kube-api-access-fb2jx\") pod \"certified-operators-mn4rt\" (UID: \"ce49dca5-e07d-416e-a72d-281928ff343b\") " pod="openshift-marketplace/certified-operators-mn4rt"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:55.266061 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mn4rt"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:56.527957 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerStarted","Data":"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87"}
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.149434 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4b75j"]
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.151271 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4b75j"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.156085 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.169000 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4b75j"]
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.275759 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-catalog-content\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.275848 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9pgm\" (UniqueName: \"kubernetes.io/projected/47027c84-0848-4140-bed0-b04f627cf6da-kube-api-access-s9pgm\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.275885 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-utilities\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.376807 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-catalog-content\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.376899 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9pgm\" (UniqueName: \"kubernetes.io/projected/47027c84-0848-4140-bed0-b04f627cf6da-kube-api-access-s9pgm\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.376933 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-utilities\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.377365 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-catalog-content\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.377516 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47027c84-0848-4140-bed0-b04f627cf6da-utilities\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.398928 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9pgm\" (UniqueName: \"kubernetes.io/projected/47027c84-0848-4140-bed0-b04f627cf6da-kube-api-access-s9pgm\") pod \"redhat-operators-4b75j\" (UID: \"47027c84-0848-4140-bed0-b04f627cf6da\") " pod="openshift-marketplace/redhat-operators-4b75j"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.471764 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4b75j"
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.535268 4713 generic.go:334] "Generic (PLEG): container finished" podID="cb44436e-472b-4a5f-8ff6-06242535e835" containerID="dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87" exitCode=0
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.535311 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerDied","Data":"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87"}
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.660649 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mn4rt"]
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.671903 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rc7p9"]
Mar 08 00:13:57 crc kubenswrapper[4713]: I0308 00:13:57.881965 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4b75j"]
Mar 08 00:13:57 crc kubenswrapper[4713]: W0308 00:13:57.945192 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47027c84_0848_4140_bed0_b04f627cf6da.slice/crio-9efc9a2ae1682110099628cdef40f19a90b3fda72d562db40a9d77a16b847985 WatchSource:0}: Error finding container 9efc9a2ae1682110099628cdef40f19a90b3fda72d562db40a9d77a16b847985: Status 404 returned error can't find the container with id 9efc9a2ae1682110099628cdef40f19a90b3fda72d562db40a9d77a16b847985
Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.542800 4713 generic.go:334] "Generic (PLEG): container finished" podID="47027c84-0848-4140-bed0-b04f627cf6da" containerID="9a0df9293f72faa7276bbe231d291ea87223054854a62c6b1b5c4bd0259e51c3" exitCode=0
Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.549194 4713 generic.go:334] "Generic (PLEG): container finished" podID="dd52d225-2e7e-4958-98fc-52028b545353" containerID="aadbf7018d16076dbc657aee64f072e5fa75d59cee7dfc64efd4d955bb09047f" exitCode=0
Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.551736 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4b75j" event={"ID":"47027c84-0848-4140-bed0-b04f627cf6da","Type":"ContainerDied","Data":"9a0df9293f72faa7276bbe231d291ea87223054854a62c6b1b5c4bd0259e51c3"}
Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.551784 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4b75j" event={"ID":"47027c84-0848-4140-bed0-b04f627cf6da","Type":"ContainerStarted","Data":"9efc9a2ae1682110099628cdef40f19a90b3fda72d562db40a9d77a16b847985"}
Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.551797 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc7p9" event={"ID":"dd52d225-2e7e-4958-98fc-52028b545353","Type":"ContainerDied","Data":"aadbf7018d16076dbc657aee64f072e5fa75d59cee7dfc64efd4d955bb09047f"}
Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.551809 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc7p9" event={"ID":"dd52d225-2e7e-4958-98fc-52028b545353","Type":"ContainerStarted","Data":"da176d9ce501a71e28b9b129ce4463db6fd643bac822e26967dcf30bf45fd6d1"}
Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.553355 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerStarted","Data":"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904"}
Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.559778 4713 generic.go:334] "Generic (PLEG): container finished" podID="ce49dca5-e07d-416e-a72d-281928ff343b" containerID="c9c8290700ae32e35f4e8c1fbafbcb84417ece9a1cd89281d73bf49c5bff9d55" exitCode=0 Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.559832 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mn4rt" event={"ID":"ce49dca5-e07d-416e-a72d-281928ff343b","Type":"ContainerDied","Data":"c9c8290700ae32e35f4e8c1fbafbcb84417ece9a1cd89281d73bf49c5bff9d55"} Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.559855 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mn4rt" event={"ID":"ce49dca5-e07d-416e-a72d-281928ff343b","Type":"ContainerStarted","Data":"c1baf1e2075d9562517317f11a3a8fc622ea3dc337446ca35af5596187bdc0e8"} Mar 08 00:13:58 crc kubenswrapper[4713]: I0308 00:13:58.626253 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4m4tz" podStartSLOduration=3.084133765 podStartE2EDuration="6.626229212s" podCreationTimestamp="2026-03-08 00:13:52 +0000 UTC" firstStartedPulling="2026-03-08 00:13:54.518688993 +0000 UTC m=+488.638321226" lastFinishedPulling="2026-03-08 00:13:58.06078443 +0000 UTC m=+492.180416673" observedRunningTime="2026-03-08 00:13:58.597059438 +0000 UTC m=+492.716691671" watchObservedRunningTime="2026-03-08 00:13:58.626229212 +0000 UTC m=+492.745861445" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.134959 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548814-v94cz"] Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.136065 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.137709 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.137961 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.138198 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.146933 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548814-v94cz"] Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.210063 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq57s\" (UniqueName: \"kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s\") pod \"auto-csr-approver-29548814-v94cz\" (UID: \"4a8563b5-1794-4b14-b040-5694cafd63e8\") " pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.311166 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq57s\" (UniqueName: \"kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s\") pod \"auto-csr-approver-29548814-v94cz\" (UID: \"4a8563b5-1794-4b14-b040-5694cafd63e8\") " pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.342682 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq57s\" (UniqueName: \"kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s\") pod \"auto-csr-approver-29548814-v94cz\" (UID: \"4a8563b5-1794-4b14-b040-5694cafd63e8\") " 
pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.452867 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.583250 4713 generic.go:334] "Generic (PLEG): container finished" podID="ce49dca5-e07d-416e-a72d-281928ff343b" containerID="205e9cc478dd42700f6421ca490ab2b0f6325662ad101bb7df497af6f7e2ab66" exitCode=0 Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.583338 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mn4rt" event={"ID":"ce49dca5-e07d-416e-a72d-281928ff343b","Type":"ContainerDied","Data":"205e9cc478dd42700f6421ca490ab2b0f6325662ad101bb7df497af6f7e2ab66"} Mar 08 00:14:00 crc kubenswrapper[4713]: I0308 00:14:00.916213 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548814-v94cz"] Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.590450 4713 generic.go:334] "Generic (PLEG): container finished" podID="dd52d225-2e7e-4958-98fc-52028b545353" containerID="556cce46681884709e1c392b6e28d24a72979eb3cd29aad02ebd53c0a0257993" exitCode=0 Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.590519 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc7p9" event={"ID":"dd52d225-2e7e-4958-98fc-52028b545353","Type":"ContainerDied","Data":"556cce46681884709e1c392b6e28d24a72979eb3cd29aad02ebd53c0a0257993"} Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.592044 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548814-v94cz" event={"ID":"4a8563b5-1794-4b14-b040-5694cafd63e8","Type":"ContainerStarted","Data":"1533f1cf7e1b1b910b3ae26450e9cc450f0feb0a1528cae746eb3fb3e80c274d"} Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.594333 4713 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-mn4rt" event={"ID":"ce49dca5-e07d-416e-a72d-281928ff343b","Type":"ContainerStarted","Data":"487b88ed04c9e30985de38c1060c602641dcbeb5ac418265f637727b6a07135b"} Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.596793 4713 generic.go:334] "Generic (PLEG): container finished" podID="47027c84-0848-4140-bed0-b04f627cf6da" containerID="8df488645c07a85b328b7cb34047ade461a7d16e4ebff0de97f353a98741b972" exitCode=0 Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.596874 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4b75j" event={"ID":"47027c84-0848-4140-bed0-b04f627cf6da","Type":"ContainerDied","Data":"8df488645c07a85b328b7cb34047ade461a7d16e4ebff0de97f353a98741b972"} Mar 08 00:14:01 crc kubenswrapper[4713]: I0308 00:14:01.648749 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mn4rt" podStartSLOduration=4.8874088239999995 podStartE2EDuration="7.648729358s" podCreationTimestamp="2026-03-08 00:13:54 +0000 UTC" firstStartedPulling="2026-03-08 00:13:58.566064126 +0000 UTC m=+492.685696359" lastFinishedPulling="2026-03-08 00:14:01.32738466 +0000 UTC m=+495.447016893" observedRunningTime="2026-03-08 00:14:01.646046717 +0000 UTC m=+495.765678970" watchObservedRunningTime="2026-03-08 00:14:01.648729358 +0000 UTC m=+495.768361591" Mar 08 00:14:02 crc kubenswrapper[4713]: I0308 00:14:02.904456 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:14:02 crc kubenswrapper[4713]: I0308 00:14:02.904775 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:14:02 crc kubenswrapper[4713]: I0308 00:14:02.950672 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:14:03 crc kubenswrapper[4713]: I0308 00:14:03.608778 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc7p9" event={"ID":"dd52d225-2e7e-4958-98fc-52028b545353","Type":"ContainerStarted","Data":"96abd5ee2356fcc5329aa327b08fceb46b417e579b158308fd457699b9419ea4"} Mar 08 00:14:03 crc kubenswrapper[4713]: I0308 00:14:03.613610 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548814-v94cz" event={"ID":"4a8563b5-1794-4b14-b040-5694cafd63e8","Type":"ContainerStarted","Data":"dfa43747f3bb6e5dbf06700a034e142c0a3b9f2938aaade963ddcb6f4fd3fb53"} Mar 08 00:14:03 crc kubenswrapper[4713]: I0308 00:14:03.647949 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548814-v94cz" podStartSLOduration=1.618770863 podStartE2EDuration="3.647928127s" podCreationTimestamp="2026-03-08 00:14:00 +0000 UTC" firstStartedPulling="2026-03-08 00:14:00.929198769 +0000 UTC m=+495.048831002" lastFinishedPulling="2026-03-08 00:14:02.958356043 +0000 UTC m=+497.077988266" observedRunningTime="2026-03-08 00:14:03.64611388 +0000 UTC m=+497.765746123" watchObservedRunningTime="2026-03-08 00:14:03.647928127 +0000 UTC m=+497.767560360" Mar 08 00:14:03 crc kubenswrapper[4713]: I0308 00:14:03.649539 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rc7p9" podStartSLOduration=5.170337096 podStartE2EDuration="9.649531829s" podCreationTimestamp="2026-03-08 00:13:54 +0000 UTC" firstStartedPulling="2026-03-08 00:13:58.55054455 +0000 UTC m=+492.670176783" lastFinishedPulling="2026-03-08 00:14:03.029739293 +0000 UTC m=+497.149371516" observedRunningTime="2026-03-08 00:14:03.630796618 +0000 UTC m=+497.750428861" watchObservedRunningTime="2026-03-08 00:14:03.649531829 +0000 UTC m=+497.769164092" Mar 08 00:14:03 crc 
kubenswrapper[4713]: I0308 00:14:03.661268 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.501961 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.502256 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.502314 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.502927 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.502983 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224" gracePeriod=600 Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 
00:14:04.626515 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4b75j" event={"ID":"47027c84-0848-4140-bed0-b04f627cf6da","Type":"ContainerStarted","Data":"5f7880248ba24ca09a433e4b3f7504ae02a28a23e62dd8888c6a6f16a95d5a69"} Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.627781 4713 generic.go:334] "Generic (PLEG): container finished" podID="4a8563b5-1794-4b14-b040-5694cafd63e8" containerID="dfa43747f3bb6e5dbf06700a034e142c0a3b9f2938aaade963ddcb6f4fd3fb53" exitCode=0 Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.627863 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548814-v94cz" event={"ID":"4a8563b5-1794-4b14-b040-5694cafd63e8","Type":"ContainerDied","Data":"dfa43747f3bb6e5dbf06700a034e142c0a3b9f2938aaade963ddcb6f4fd3fb53"} Mar 08 00:14:04 crc kubenswrapper[4713]: I0308 00:14:04.644429 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4b75j" podStartSLOduration=2.620818456 podStartE2EDuration="7.644412571s" podCreationTimestamp="2026-03-08 00:13:57 +0000 UTC" firstStartedPulling="2026-03-08 00:13:58.544427329 +0000 UTC m=+492.664059562" lastFinishedPulling="2026-03-08 00:14:03.568021444 +0000 UTC m=+497.687653677" observedRunningTime="2026-03-08 00:14:04.642984153 +0000 UTC m=+498.762616406" watchObservedRunningTime="2026-03-08 00:14:04.644412571 +0000 UTC m=+498.764044794" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.068931 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.069220 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.267280 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.267625 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.301035 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.635996 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224" exitCode=0 Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.636087 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224"} Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.637113 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596"} Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.637136 4713 scope.go:117] "RemoveContainer" containerID="ac199245af459acead4b5879445fc603296f72d27886545be5fc80257bd154fd" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.926242 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.989883 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq57s\" (UniqueName: \"kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s\") pod \"4a8563b5-1794-4b14-b040-5694cafd63e8\" (UID: \"4a8563b5-1794-4b14-b040-5694cafd63e8\") " Mar 08 00:14:05 crc kubenswrapper[4713]: I0308 00:14:05.994444 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s" (OuterVolumeSpecName: "kube-api-access-lq57s") pod "4a8563b5-1794-4b14-b040-5694cafd63e8" (UID: "4a8563b5-1794-4b14-b040-5694cafd63e8"). InnerVolumeSpecName "kube-api-access-lq57s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.091556 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq57s\" (UniqueName: \"kubernetes.io/projected/4a8563b5-1794-4b14-b040-5694cafd63e8-kube-api-access-lq57s\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.109178 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-rc7p9" podUID="dd52d225-2e7e-4958-98fc-52028b545353" containerName="registry-server" probeResult="failure" output=< Mar 08 00:14:06 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 08 00:14:06 crc kubenswrapper[4713]: > Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.644749 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548814-v94cz" event={"ID":"4a8563b5-1794-4b14-b040-5694cafd63e8","Type":"ContainerDied","Data":"1533f1cf7e1b1b910b3ae26450e9cc450f0feb0a1528cae746eb3fb3e80c274d"} Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.644795 
4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1533f1cf7e1b1b910b3ae26450e9cc450f0feb0a1528cae746eb3fb3e80c274d" Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.644774 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548814-v94cz" Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.699748 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548808-nd57l"] Mar 08 00:14:06 crc kubenswrapper[4713]: I0308 00:14:06.703457 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548808-nd57l"] Mar 08 00:14:07 crc kubenswrapper[4713]: I0308 00:14:07.472403 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:14:07 crc kubenswrapper[4713]: I0308 00:14:07.472692 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:14:08 crc kubenswrapper[4713]: I0308 00:14:08.511907 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4b75j" podUID="47027c84-0848-4140-bed0-b04f627cf6da" containerName="registry-server" probeResult="failure" output=< Mar 08 00:14:08 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 08 00:14:08 crc kubenswrapper[4713]: > Mar 08 00:14:08 crc kubenswrapper[4713]: I0308 00:14:08.549127 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdccd72c-79d7-4388-926e-0539c571dafe" path="/var/lib/kubelet/pods/fdccd72c-79d7-4388-926e-0539c571dafe/volumes" Mar 08 00:14:15 crc kubenswrapper[4713]: I0308 00:14:15.106412 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:14:15 crc kubenswrapper[4713]: I0308 00:14:15.154750 
4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rc7p9" Mar 08 00:14:15 crc kubenswrapper[4713]: I0308 00:14:15.335696 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mn4rt" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.060960 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" containerID="cri-o://6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d" gracePeriod=15 Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.467610 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493435 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493549 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493620 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data\") pod 
\"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493665 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493704 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493726 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493763 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493804 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: 
\"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493850 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493883 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493940 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.493969 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.494002 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 
00:14:17.494045 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp6ps\" (UniqueName: \"kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps\") pod \"c9df8d9c-b59f-4a1c-9fb4-668123290569\" (UID: \"c9df8d9c-b59f-4a1c-9fb4-668123290569\") " Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.494333 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.494913 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.495273 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.495355 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.495895 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.509977 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.510413 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.520639 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-68f4889fd8-bwpcm"] Mar 08 00:14:17 crc kubenswrapper[4713]: E0308 00:14:17.520939 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a8563b5-1794-4b14-b040-5694cafd63e8" containerName="oc" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.520958 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8563b5-1794-4b14-b040-5694cafd63e8" containerName="oc" Mar 08 00:14:17 crc kubenswrapper[4713]: E0308 00:14:17.520975 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.520983 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.521101 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerName="oauth-openshift" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.521115 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a8563b5-1794-4b14-b040-5694cafd63e8" containerName="oc" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.521592 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.527160 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.527599 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.527721 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.527920 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.528133 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.528860 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.528957 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68f4889fd8-bwpcm"] Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.529196 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.536321 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps" (OuterVolumeSpecName: "kube-api-access-mp6ps") pod "c9df8d9c-b59f-4a1c-9fb4-668123290569" (UID: "c9df8d9c-b59f-4a1c-9fb4-668123290569"). InnerVolumeSpecName "kube-api-access-mp6ps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.578271 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4b75j" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595311 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-session\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595354 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-error\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595400 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-audit-policies\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595421 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: 
\"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595439 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595459 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.595730 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596109 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 
08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596246 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zdcc\" (UniqueName: \"kubernetes.io/projected/bb9e6372-a327-41fd-8d17-662579df072a-kube-api-access-7zdcc\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596278 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-login\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596321 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb9e6372-a327-41fd-8d17-662579df072a-audit-dir\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596338 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-router-certs\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596357 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.596407 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-service-ca\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597071 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp6ps\" (UniqueName: \"kubernetes.io/projected/c9df8d9c-b59f-4a1c-9fb4-668123290569-kube-api-access-mp6ps\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597110 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597123 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597133 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597144 4713 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597175 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597187 4713 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597196 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597208 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597217 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597225 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 
00:14:17.597234 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597265 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.597275 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c9df8d9c-b59f-4a1c-9fb4-668123290569-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698555 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-session\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698611 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-error\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698653 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-audit-policies\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698683 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698709 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.698731 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " 
pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699332 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699374 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zdcc\" (UniqueName: \"kubernetes.io/projected/bb9e6372-a327-41fd-8d17-662579df072a-kube-api-access-7zdcc\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb9e6372-a327-41fd-8d17-662579df072a-audit-dir\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699424 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-login\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699446 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-router-certs\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699469 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699490 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-service-ca\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699627 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-audit-policies\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.699991 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc 
kubenswrapper[4713]: I0308 00:14:17.700066 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb9e6372-a327-41fd-8d17-662579df072a-audit-dir\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.700182 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-service-ca\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.700517 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.702947 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.702981 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-session\") pod 
\"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.703159 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-login\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.703266 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-error\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.703676 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.703945 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-router-certs\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.704640 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.705122 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bb9e6372-a327-41fd-8d17-662579df072a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.706887 4713 generic.go:334] "Generic (PLEG): container finished" podID="c9df8d9c-b59f-4a1c-9fb4-668123290569" containerID="6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d" exitCode=0 Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.706948 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.707045 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" event={"ID":"c9df8d9c-b59f-4a1c-9fb4-668123290569","Type":"ContainerDied","Data":"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d"} Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.707170 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8gbn" event={"ID":"c9df8d9c-b59f-4a1c-9fb4-668123290569","Type":"ContainerDied","Data":"e0d410e7c38a223bcd0189e0430b8bd6e62ba561f8515070eac1a52a52fdb35d"} Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.707254 4713 scope.go:117] "RemoveContainer" containerID="6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.716061 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zdcc\" (UniqueName: \"kubernetes.io/projected/bb9e6372-a327-41fd-8d17-662579df072a-kube-api-access-7zdcc\") pod \"oauth-openshift-68f4889fd8-bwpcm\" (UID: \"bb9e6372-a327-41fd-8d17-662579df072a\") " pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.729176 4713 scope.go:117] "RemoveContainer" containerID="6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d" Mar 08 00:14:17 crc kubenswrapper[4713]: E0308 00:14:17.731501 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d\": container with ID starting with 6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d not found: ID does not exist" containerID="6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d" Mar 08 
00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.731588 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d"} err="failed to get container status \"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d\": rpc error: code = NotFound desc = could not find container \"6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d\": container with ID starting with 6182e807253ba09b176be3aa1eed3d59dbf32b0a321c8119cab78468705d4a0d not found: ID does not exist" Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.747557 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"] Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.750077 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8gbn"] Mar 08 00:14:17 crc kubenswrapper[4713]: I0308 00:14:17.879330 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.306725 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68f4889fd8-bwpcm"] Mar 08 00:14:18 crc kubenswrapper[4713]: W0308 00:14:18.312104 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb9e6372_a327_41fd_8d17_662579df072a.slice/crio-4f300ec69591808fce68228ca51b688fad8d967e01dbb656085cd2de5b20ba1b WatchSource:0}: Error finding container 4f300ec69591808fce68228ca51b688fad8d967e01dbb656085cd2de5b20ba1b: Status 404 returned error can't find the container with id 4f300ec69591808fce68228ca51b688fad8d967e01dbb656085cd2de5b20ba1b Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.548366 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9df8d9c-b59f-4a1c-9fb4-668123290569" path="/var/lib/kubelet/pods/c9df8d9c-b59f-4a1c-9fb4-668123290569/volumes" Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.715638 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" event={"ID":"bb9e6372-a327-41fd-8d17-662579df072a","Type":"ContainerStarted","Data":"a6e2c3505648e6e6b5ac48c44e2d593c92dff1d93421a5564c50f1e00b84de99"} Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.716598 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" event={"ID":"bb9e6372-a327-41fd-8d17-662579df072a","Type":"ContainerStarted","Data":"4f300ec69591808fce68228ca51b688fad8d967e01dbb656085cd2de5b20ba1b"} Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.716726 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:14:18 crc kubenswrapper[4713]: I0308 00:14:18.739621 4713 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" podStartSLOduration=26.739598277 podStartE2EDuration="26.739598277s" podCreationTimestamp="2026-03-08 00:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:14:18.735121369 +0000 UTC m=+512.854753622" watchObservedRunningTime="2026-03-08 00:14:18.739598277 +0000 UTC m=+512.859230520" Mar 08 00:14:19 crc kubenswrapper[4713]: I0308 00:14:19.094019 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68f4889fd8-bwpcm" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.151258 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7"] Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.152632 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.154295 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.154454 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.156996 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7"] Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.204658 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.204727 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.204748 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj8jg\" (UniqueName: \"kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.305938 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.306000 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj8jg\" (UniqueName: \"kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.306147 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.306805 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.316484 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.321570 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj8jg\" (UniqueName: \"kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg\") pod \"collect-profiles-29548815-v44m7\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.472898 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.684677 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7"] Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.937405 4713 generic.go:334] "Generic (PLEG): container finished" podID="4976d892-c6f5-417a-a992-72cf7e278170" containerID="b268d8626cb813a8937d020ece6a7ce9fef74733b7a185d9b285e8849e08f38b" exitCode=0 Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.937453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" event={"ID":"4976d892-c6f5-417a-a992-72cf7e278170","Type":"ContainerDied","Data":"b268d8626cb813a8937d020ece6a7ce9fef74733b7a185d9b285e8849e08f38b"} Mar 08 00:15:00 crc kubenswrapper[4713]: I0308 00:15:00.937512 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" 
event={"ID":"4976d892-c6f5-417a-a992-72cf7e278170","Type":"ContainerStarted","Data":"b4b06978e80e0a298a51f6db841bc5e0f775b31e800b30bc160753a3eedce122"} Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.143195 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.329022 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume\") pod \"4976d892-c6f5-417a-a992-72cf7e278170\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.329093 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume\") pod \"4976d892-c6f5-417a-a992-72cf7e278170\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.329115 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj8jg\" (UniqueName: \"kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg\") pod \"4976d892-c6f5-417a-a992-72cf7e278170\" (UID: \"4976d892-c6f5-417a-a992-72cf7e278170\") " Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.329919 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume" (OuterVolumeSpecName: "config-volume") pod "4976d892-c6f5-417a-a992-72cf7e278170" (UID: "4976d892-c6f5-417a-a992-72cf7e278170"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.334306 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg" (OuterVolumeSpecName: "kube-api-access-xj8jg") pod "4976d892-c6f5-417a-a992-72cf7e278170" (UID: "4976d892-c6f5-417a-a992-72cf7e278170"). InnerVolumeSpecName "kube-api-access-xj8jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.334341 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4976d892-c6f5-417a-a992-72cf7e278170" (UID: "4976d892-c6f5-417a-a992-72cf7e278170"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.431245 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4976d892-c6f5-417a-a992-72cf7e278170-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.431301 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4976d892-c6f5-417a-a992-72cf7e278170-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.431335 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj8jg\" (UniqueName: \"kubernetes.io/projected/4976d892-c6f5-417a-a992-72cf7e278170-kube-api-access-xj8jg\") on node \"crc\" DevicePath \"\"" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.953254 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" 
event={"ID":"4976d892-c6f5-417a-a992-72cf7e278170","Type":"ContainerDied","Data":"b4b06978e80e0a298a51f6db841bc5e0f775b31e800b30bc160753a3eedce122"} Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.953340 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4b06978e80e0a298a51f6db841bc5e0f775b31e800b30bc160753a3eedce122" Mar 08 00:15:02 crc kubenswrapper[4713]: I0308 00:15:02.953289 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548815-v44m7" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.132669 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548816-gtsk5"] Mar 08 00:16:00 crc kubenswrapper[4713]: E0308 00:16:00.133537 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4976d892-c6f5-417a-a992-72cf7e278170" containerName="collect-profiles" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.133555 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4976d892-c6f5-417a-a992-72cf7e278170" containerName="collect-profiles" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.133703 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4976d892-c6f5-417a-a992-72cf7e278170" containerName="collect-profiles" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.134167 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.137653 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.137998 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.138118 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.151396 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548816-gtsk5"] Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.228943 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-654jv\" (UniqueName: \"kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv\") pod \"auto-csr-approver-29548816-gtsk5\" (UID: \"e4623866-795f-438d-9b3b-66afb30f9657\") " pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.330113 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-654jv\" (UniqueName: \"kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv\") pod \"auto-csr-approver-29548816-gtsk5\" (UID: \"e4623866-795f-438d-9b3b-66afb30f9657\") " pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.350138 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-654jv\" (UniqueName: \"kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv\") pod \"auto-csr-approver-29548816-gtsk5\" (UID: \"e4623866-795f-438d-9b3b-66afb30f9657\") " 
pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.459032 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.637987 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548816-gtsk5"] Mar 08 00:16:00 crc kubenswrapper[4713]: I0308 00:16:00.649968 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:16:01 crc kubenswrapper[4713]: I0308 00:16:01.306665 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" event={"ID":"e4623866-795f-438d-9b3b-66afb30f9657","Type":"ContainerStarted","Data":"6b5888737fbdd67a29e4d77fa22d161f97bf4a7024dd7077378a96e856992b46"} Mar 08 00:16:03 crc kubenswrapper[4713]: I0308 00:16:03.323314 4713 generic.go:334] "Generic (PLEG): container finished" podID="e4623866-795f-438d-9b3b-66afb30f9657" containerID="88536119c11c7644e16e9556af63bc5f387d89253eeaf6cbd55a1eddd526755e" exitCode=0 Mar 08 00:16:03 crc kubenswrapper[4713]: I0308 00:16:03.323374 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" event={"ID":"e4623866-795f-438d-9b3b-66afb30f9657","Type":"ContainerDied","Data":"88536119c11c7644e16e9556af63bc5f387d89253eeaf6cbd55a1eddd526755e"} Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.500911 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.500970 4713 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.523020 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.686245 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-654jv\" (UniqueName: \"kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv\") pod \"e4623866-795f-438d-9b3b-66afb30f9657\" (UID: \"e4623866-795f-438d-9b3b-66afb30f9657\") " Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.697038 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv" (OuterVolumeSpecName: "kube-api-access-654jv") pod "e4623866-795f-438d-9b3b-66afb30f9657" (UID: "e4623866-795f-438d-9b3b-66afb30f9657"). InnerVolumeSpecName "kube-api-access-654jv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:16:04 crc kubenswrapper[4713]: I0308 00:16:04.788116 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-654jv\" (UniqueName: \"kubernetes.io/projected/e4623866-795f-438d-9b3b-66afb30f9657-kube-api-access-654jv\") on node \"crc\" DevicePath \"\"" Mar 08 00:16:05 crc kubenswrapper[4713]: I0308 00:16:05.336768 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" event={"ID":"e4623866-795f-438d-9b3b-66afb30f9657","Type":"ContainerDied","Data":"6b5888737fbdd67a29e4d77fa22d161f97bf4a7024dd7077378a96e856992b46"} Mar 08 00:16:05 crc kubenswrapper[4713]: I0308 00:16:05.337039 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b5888737fbdd67a29e4d77fa22d161f97bf4a7024dd7077378a96e856992b46" Mar 08 00:16:05 crc kubenswrapper[4713]: I0308 00:16:05.336892 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548816-gtsk5" Mar 08 00:16:05 crc kubenswrapper[4713]: I0308 00:16:05.574340 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548810-lnmdz"] Mar 08 00:16:05 crc kubenswrapper[4713]: I0308 00:16:05.577997 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548810-lnmdz"] Mar 08 00:16:06 crc kubenswrapper[4713]: I0308 00:16:06.547731 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6470285d-4460-4c72-be17-00e880cc623d" path="/var/lib/kubelet/pods/6470285d-4460-4c72-be17-00e880cc623d/volumes" Mar 08 00:16:34 crc kubenswrapper[4713]: I0308 00:16:34.500582 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 08 00:16:34 crc kubenswrapper[4713]: I0308 00:16:34.501096 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.501208 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.501993 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.502057 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.502811 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.502949 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596" gracePeriod=600 Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.682319 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596" exitCode=0 Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.682437 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596"} Mar 08 00:17:04 crc kubenswrapper[4713]: I0308 00:17:04.683341 4713 scope.go:117] "RemoveContainer" containerID="01a3ae60af94ae8d21eb3d737224225b18f319c8b266fff21272171a73177224" Mar 08 00:17:05 crc kubenswrapper[4713]: I0308 00:17:05.689301 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4"} Mar 08 00:17:12 crc kubenswrapper[4713]: I0308 00:17:12.761142 4713 scope.go:117] "RemoveContainer" containerID="11992517ed2080bab72a9aa961669962e2daffa5f367346a3dc9ef9010cbb913" Mar 08 00:17:12 crc kubenswrapper[4713]: I0308 00:17:12.808075 4713 scope.go:117] "RemoveContainer" containerID="1cac5b889750a3972edc99367bdaaf3ef41e15813fd86b31ba34d9a937e3a2a1" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.149051 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548818-c92cn"] Mar 08 00:18:00 crc kubenswrapper[4713]: E0308 00:18:00.149854 4713 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e4623866-795f-438d-9b3b-66afb30f9657" containerName="oc" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.149870 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4623866-795f-438d-9b3b-66afb30f9657" containerName="oc" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.149975 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4623866-795f-438d-9b3b-66afb30f9657" containerName="oc" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.151112 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.154443 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.154962 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.158653 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.163368 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548818-c92cn"] Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.190579 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv6wf\" (UniqueName: \"kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf\") pod \"auto-csr-approver-29548818-c92cn\" (UID: \"bbf256d4-02b4-46fd-86a1-793e34a17bf5\") " pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.291931 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv6wf\" (UniqueName: 
\"kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf\") pod \"auto-csr-approver-29548818-c92cn\" (UID: \"bbf256d4-02b4-46fd-86a1-793e34a17bf5\") " pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.312875 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv6wf\" (UniqueName: \"kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf\") pod \"auto-csr-approver-29548818-c92cn\" (UID: \"bbf256d4-02b4-46fd-86a1-793e34a17bf5\") " pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.472068 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:00 crc kubenswrapper[4713]: I0308 00:18:00.677506 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548818-c92cn"] Mar 08 00:18:01 crc kubenswrapper[4713]: I0308 00:18:01.012011 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548818-c92cn" event={"ID":"bbf256d4-02b4-46fd-86a1-793e34a17bf5","Type":"ContainerStarted","Data":"e27e7645df1ead5fd4aae04f4924dd88ade44b24c3da38f8427f022cb5a5d26d"} Mar 08 00:18:03 crc kubenswrapper[4713]: I0308 00:18:03.023672 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548818-c92cn" event={"ID":"bbf256d4-02b4-46fd-86a1-793e34a17bf5","Type":"ContainerDied","Data":"0f83288064679e56b151b6696b75672f2d4637476a38071e252b04509b88078f"} Mar 08 00:18:03 crc kubenswrapper[4713]: I0308 00:18:03.024040 4713 generic.go:334] "Generic (PLEG): container finished" podID="bbf256d4-02b4-46fd-86a1-793e34a17bf5" containerID="0f83288064679e56b151b6696b75672f2d4637476a38071e252b04509b88078f" exitCode=0 Mar 08 00:18:04 crc kubenswrapper[4713]: I0308 00:18:04.227730 4713 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:04 crc kubenswrapper[4713]: I0308 00:18:04.235457 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv6wf\" (UniqueName: \"kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf\") pod \"bbf256d4-02b4-46fd-86a1-793e34a17bf5\" (UID: \"bbf256d4-02b4-46fd-86a1-793e34a17bf5\") " Mar 08 00:18:04 crc kubenswrapper[4713]: I0308 00:18:04.243267 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf" (OuterVolumeSpecName: "kube-api-access-tv6wf") pod "bbf256d4-02b4-46fd-86a1-793e34a17bf5" (UID: "bbf256d4-02b4-46fd-86a1-793e34a17bf5"). InnerVolumeSpecName "kube-api-access-tv6wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:18:04 crc kubenswrapper[4713]: I0308 00:18:04.336486 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv6wf\" (UniqueName: \"kubernetes.io/projected/bbf256d4-02b4-46fd-86a1-793e34a17bf5-kube-api-access-tv6wf\") on node \"crc\" DevicePath \"\"" Mar 08 00:18:05 crc kubenswrapper[4713]: I0308 00:18:05.036731 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548818-c92cn" event={"ID":"bbf256d4-02b4-46fd-86a1-793e34a17bf5","Type":"ContainerDied","Data":"e27e7645df1ead5fd4aae04f4924dd88ade44b24c3da38f8427f022cb5a5d26d"} Mar 08 00:18:05 crc kubenswrapper[4713]: I0308 00:18:05.036777 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e27e7645df1ead5fd4aae04f4924dd88ade44b24c3da38f8427f022cb5a5d26d" Mar 08 00:18:05 crc kubenswrapper[4713]: I0308 00:18:05.036869 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548818-c92cn" Mar 08 00:18:05 crc kubenswrapper[4713]: I0308 00:18:05.284099 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548812-24fjw"] Mar 08 00:18:05 crc kubenswrapper[4713]: I0308 00:18:05.287018 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548812-24fjw"] Mar 08 00:18:06 crc kubenswrapper[4713]: I0308 00:18:06.554985 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12cdabef-a56e-45d2-8896-aab98bd84fb1" path="/var/lib/kubelet/pods/12cdabef-a56e-45d2-8896-aab98bd84fb1/volumes" Mar 08 00:18:12 crc kubenswrapper[4713]: I0308 00:18:12.866159 4713 scope.go:117] "RemoveContainer" containerID="71f869c9a3deae4099eb6a9e0da68e9d0801b114263bfc45efc59f3dae8002be" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.630193 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vh48p"] Mar 08 00:18:56 crc kubenswrapper[4713]: E0308 00:18:56.631128 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf256d4-02b4-46fd-86a1-793e34a17bf5" containerName="oc" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.631141 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf256d4-02b4-46fd-86a1-793e34a17bf5" containerName="oc" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.631263 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf256d4-02b4-46fd-86a1-793e34a17bf5" containerName="oc" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.631673 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.648261 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vh48p"] Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725493 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-registry-certificates\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725564 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ddda6293-48b1-4007-bb9c-b3657e684836-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725592 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-trusted-ca\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725619 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-registry-tls\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725638 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ddda6293-48b1-4007-bb9c-b3657e684836-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725672 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-bound-sa-token\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725708 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.725739 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r78s\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-kube-api-access-2r78s\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.750901 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826814 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-registry-certificates\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826876 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ddda6293-48b1-4007-bb9c-b3657e684836-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826898 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-trusted-ca\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826919 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-registry-tls\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc 
kubenswrapper[4713]: I0308 00:18:56.826936 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ddda6293-48b1-4007-bb9c-b3657e684836-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826963 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-bound-sa-token\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.826992 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r78s\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-kube-api-access-2r78s\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.827651 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ddda6293-48b1-4007-bb9c-b3657e684836-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.828063 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-registry-certificates\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.828986 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ddda6293-48b1-4007-bb9c-b3657e684836-trusted-ca\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.833196 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-registry-tls\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.833265 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ddda6293-48b1-4007-bb9c-b3657e684836-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.843519 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-bound-sa-token\") pod \"image-registry-66df7c8f76-vh48p\" (UID: \"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.847591 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r78s\" (UniqueName: \"kubernetes.io/projected/ddda6293-48b1-4007-bb9c-b3657e684836-kube-api-access-2r78s\") pod \"image-registry-66df7c8f76-vh48p\" (UID: 
\"ddda6293-48b1-4007-bb9c-b3657e684836\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:56 crc kubenswrapper[4713]: I0308 00:18:56.948538 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:57 crc kubenswrapper[4713]: I0308 00:18:57.127524 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vh48p"] Mar 08 00:18:57 crc kubenswrapper[4713]: I0308 00:18:57.325543 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" event={"ID":"ddda6293-48b1-4007-bb9c-b3657e684836","Type":"ContainerStarted","Data":"53f962bfd47cf0b0c18eb9485e287e2a19142df7872b87cbbf68ac0e7f60a938"} Mar 08 00:18:57 crc kubenswrapper[4713]: I0308 00:18:57.325582 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" event={"ID":"ddda6293-48b1-4007-bb9c-b3657e684836","Type":"ContainerStarted","Data":"64b8203ca59c865f4bfca95896c57cb2e4bd11333fc749708ca86b77a4f880cb"} Mar 08 00:18:57 crc kubenswrapper[4713]: I0308 00:18:57.325687 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:18:57 crc kubenswrapper[4713]: I0308 00:18:57.349190 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" podStartSLOduration=1.349175606 podStartE2EDuration="1.349175606s" podCreationTimestamp="2026-03-08 00:18:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:18:57.349114954 +0000 UTC m=+791.468747197" watchObservedRunningTime="2026-03-08 00:18:57.349175606 +0000 UTC m=+791.468807839" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 
00:19:04.176057 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gsfft"] Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.176867 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="nbdb" containerID="cri-o://2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.176886 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-node" containerID="cri-o://b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.176996 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-acl-logging" containerID="cri-o://2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.177018 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="sbdb" containerID="cri-o://4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.176955 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: 
I0308 00:19:04.177023 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-controller" containerID="cri-o://141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.176874 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="northd" containerID="cri-o://8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.209464 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" containerID="cri-o://824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" gracePeriod=30 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.373017 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovnkube-controller/3.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.379131 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-acl-logging/0.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.379702 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-controller/0.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380084 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" 
exitCode=0 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380167 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" exitCode=0 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380221 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" exitCode=0 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380274 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" exitCode=143 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380331 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" exitCode=143 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380419 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d"} Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380499 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93"} Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380563 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855"} Mar 08 
00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380628 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43"} Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380698 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0"} Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.380783 4713 scope.go:117] "RemoveContainer" containerID="cb31afde520b617c338234c9c7384b57aaf2570f907b37ae0ab797b2dd901a2e" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.382511 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/2.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.382889 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/1.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.382984 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf95e3f7-808b-434f-8fd4-c7e7365a1561" containerID="393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222" exitCode=2 Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.382996 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerDied","Data":"393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222"} Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.383558 4713 scope.go:117] "RemoveContainer" containerID="393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222" Mar 08 00:19:04 crc 
kubenswrapper[4713]: E0308 00:19:04.383905 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-fh96f_openshift-multus(bf95e3f7-808b-434f-8fd4-c7e7365a1561)\"" pod="openshift-multus/multus-fh96f" podUID="bf95e3f7-808b-434f-8fd4-c7e7365a1561" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.479581 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fbba07_87e8_4e77_b834_ed68af718d11.slice/crio-2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fbba07_87e8_4e77_b834_ed68af718d11.slice/crio-8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fbba07_87e8_4e77_b834_ed68af718d11.slice/crio-conmon-8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864.scope\": RecentStats: unable to find data in memory cache]" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.493605 4713 scope.go:117] "RemoveContainer" containerID="889d2148380bf677798262abdd95c84d2fd000431e7c34ae8b9e128afe19e86f" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.500886 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.501029 4713 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.515567 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-acl-logging/0.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.516036 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-controller/0.log" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.516484 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565075 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8g77c"] Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565281 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565292 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565301 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kubecfg-setup" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565307 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kubecfg-setup" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565314 4713 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565320 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565328 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565334 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565341 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565347 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565359 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565365 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565373 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-node" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565380 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-node" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565387 4713 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565394 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565404 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="nbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565409 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="nbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565416 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-acl-logging" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565422 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-acl-logging" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565429 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="sbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565434 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="sbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565441 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="northd" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565447 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="northd" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565539 4713 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565548 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565554 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565562 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565569 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565577 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565586 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="northd" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565594 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="nbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565601 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565608 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovn-acl-logging" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565615 4713 
memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="sbdb" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565622 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="kube-rbac-proxy-node" Mar 08 00:19:04 crc kubenswrapper[4713]: E0308 00:19:04.565706 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.565712 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" containerName="ovnkube-controller" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.567228 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625890 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625932 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625947 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes\") pod 
\"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625956 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625973 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625990 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.625992 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626002 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626011 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket" (OuterVolumeSpecName: "log-socket") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626021 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626030 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log" (OuterVolumeSpecName: "node-log") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626041 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626078 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626092 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626089 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626130 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626083 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626288 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626400 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626423 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626465 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626487 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626542 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626562 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626577 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626593 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl27z\" (UniqueName: \"kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626629 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.626840 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627310 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627337 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627344 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627353 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627368 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627394 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units\") pod \"56fbba07-87e8-4e77-b834-ed68af718d11\" (UID: \"56fbba07-87e8-4e77-b834-ed68af718d11\") " Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627458 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash" (OuterVolumeSpecName: "host-slash") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627480 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627497 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627520 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7jf6\" (UniqueName: \"kubernetes.io/projected/1d4b1127-6d10-4c83-b3e9-f588af09812c-kube-api-access-d7jf6\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627543 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-netns\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627520 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627658 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-ovn\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627699 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-etc-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627729 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-netd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627807 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-log-socket\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627870 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627900 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627926 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-script-lib\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627953 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-bin\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.627972 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-kubelet\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628125 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-node-log\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628175 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-slash\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628241 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovn-node-metrics-cert\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628275 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-systemd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628310 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-config\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628338 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-systemd-units\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628361 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-var-lib-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628382 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-env-overrides\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628444 4713 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628458 4713 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628473 4713 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628486 4713 reconciler_common.go:293] "Volume detached 
for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628498 4713 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-log-socket\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628509 4713 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-node-log\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628520 4713 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628532 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628543 4713 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628554 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628565 4713 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628575 4713 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628586 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56fbba07-87e8-4e77-b834-ed68af718d11-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628597 4713 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628608 4713 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628618 4713 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.628628 4713 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-host-slash\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.631114 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod 
"56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.631306 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z" (OuterVolumeSpecName: "kube-api-access-zl27z") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "kube-api-access-zl27z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.638057 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "56fbba07-87e8-4e77-b834-ed68af718d11" (UID: "56fbba07-87e8-4e77-b834-ed68af718d11"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.729859 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-node-log\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730199 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-slash\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730030 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-node-log\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730223 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovn-node-metrics-cert\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730273 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-slash\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730317 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-systemd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730372 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-config\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730411 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-systemd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730415 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-systemd-units\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730441 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-systemd-units\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730457 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-var-lib-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730482 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-env-overrides\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730505 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730534 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7jf6\" (UniqueName: \"kubernetes.io/projected/1d4b1127-6d10-4c83-b3e9-f588af09812c-kube-api-access-d7jf6\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730576 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-netns\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730607 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-etc-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730628 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-ovn\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730648 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-netd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730683 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-log-socket\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730712 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730744 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730777 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-script-lib\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730798 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-netns\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730812 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-bin\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730866 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-bin\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730865 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-etc-openvswitch\") pod 
\"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730892 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-ovn\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730918 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-run-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730924 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-cni-netd\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730951 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-log-socket\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.730984 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731014 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-kubelet\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731434 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-config\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731466 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-host-kubelet\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731489 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d4b1127-6d10-4c83-b3e9-f588af09812c-var-lib-openvswitch\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: 
I0308 00:19:04.731570 4713 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56fbba07-87e8-4e77-b834-ed68af718d11-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731585 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl27z\" (UniqueName: \"kubernetes.io/projected/56fbba07-87e8-4e77-b834-ed68af718d11-kube-api-access-zl27z\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731597 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56fbba07-87e8-4e77-b834-ed68af718d11-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731762 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovnkube-script-lib\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.731908 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d4b1127-6d10-4c83-b3e9-f588af09812c-env-overrides\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.734187 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d4b1127-6d10-4c83-b3e9-f588af09812c-ovn-node-metrics-cert\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.745322 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7jf6\" (UniqueName: \"kubernetes.io/projected/1d4b1127-6d10-4c83-b3e9-f588af09812c-kube-api-access-d7jf6\") pod \"ovnkube-node-8g77c\" (UID: \"1d4b1127-6d10-4c83-b3e9-f588af09812c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: I0308 00:19:04.882150 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:04 crc kubenswrapper[4713]: W0308 00:19:04.904049 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d4b1127_6d10_4c83_b3e9_f588af09812c.slice/crio-ef3205ca25ec388a1264999823542024a220c534e27dfac0241089821b86b464 WatchSource:0}: Error finding container ef3205ca25ec388a1264999823542024a220c534e27dfac0241089821b86b464: Status 404 returned error can't find the container with id ef3205ca25ec388a1264999823542024a220c534e27dfac0241089821b86b464 Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.395765 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-acl-logging/0.log" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397171 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gsfft_56fbba07-87e8-4e77-b834-ed68af718d11/ovn-controller/0.log" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397478 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" exitCode=0 Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397567 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" 
containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" exitCode=0 Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397629 4713 generic.go:334] "Generic (PLEG): container finished" podID="56fbba07-87e8-4e77-b834-ed68af718d11" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" exitCode=0 Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397634 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397574 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397972 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.397994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.398007 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gsfft" event={"ID":"56fbba07-87e8-4e77-b834-ed68af718d11","Type":"ContainerDied","Data":"6355753be9662030b1350e38ca6fc0620acd7ba140b99c59577d4d942dd0976d"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.398025 4713 scope.go:117] "RemoveContainer" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" Mar 08 
00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.400722 4713 generic.go:334] "Generic (PLEG): container finished" podID="1d4b1127-6d10-4c83-b3e9-f588af09812c" containerID="3e0a22bf48247677a94d418562e87f416f360a48b70ed912f3114a78b57c2d60" exitCode=0 Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.400802 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerDied","Data":"3e0a22bf48247677a94d418562e87f416f360a48b70ed912f3114a78b57c2d60"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.400866 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"ef3205ca25ec388a1264999823542024a220c534e27dfac0241089821b86b464"} Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.405125 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/2.log" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.432039 4713 scope.go:117] "RemoveContainer" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.462841 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gsfft"] Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.464715 4713 scope.go:117] "RemoveContainer" containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.467359 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gsfft"] Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.478167 4713 scope.go:117] "RemoveContainer" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" Mar 08 00:19:05 crc 
kubenswrapper[4713]: I0308 00:19:05.498781 4713 scope.go:117] "RemoveContainer" containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.509195 4713 scope.go:117] "RemoveContainer" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.520880 4713 scope.go:117] "RemoveContainer" containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.533774 4713 scope.go:117] "RemoveContainer" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.557945 4713 scope.go:117] "RemoveContainer" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.576052 4713 scope.go:117] "RemoveContainer" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.576489 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": container with ID starting with 824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d not found: ID does not exist" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.576537 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d"} err="failed to get container status \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": rpc error: code = NotFound desc = could not find container \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": container with 
ID starting with 824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.576563 4713 scope.go:117] "RemoveContainer" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.576814 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": container with ID starting with 4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078 not found: ID does not exist" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.576883 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078"} err="failed to get container status \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": rpc error: code = NotFound desc = could not find container \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": container with ID starting with 4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.576901 4713 scope.go:117] "RemoveContainer" containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.577156 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": container with ID starting with 2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b not found: ID does not exist" containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" Mar 08 
00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577177 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b"} err="failed to get container status \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": rpc error: code = NotFound desc = could not find container \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": container with ID starting with 2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577194 4713 scope.go:117] "RemoveContainer" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.577641 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": container with ID starting with 8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864 not found: ID does not exist" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577672 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864"} err="failed to get container status \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": rpc error: code = NotFound desc = could not find container \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": container with ID starting with 8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577693 4713 scope.go:117] "RemoveContainer" 
containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.577942 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": container with ID starting with dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93 not found: ID does not exist" containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577968 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93"} err="failed to get container status \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": rpc error: code = NotFound desc = could not find container \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": container with ID starting with dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.577985 4713 scope.go:117] "RemoveContainer" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.578237 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": container with ID starting with b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855 not found: ID does not exist" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.578264 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855"} err="failed to get container status \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": rpc error: code = NotFound desc = could not find container \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": container with ID starting with b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.578282 4713 scope.go:117] "RemoveContainer" containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.578540 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": container with ID starting with 2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43 not found: ID does not exist" containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.578563 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43"} err="failed to get container status \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": rpc error: code = NotFound desc = could not find container \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": container with ID starting with 2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.578581 4713 scope.go:117] "RemoveContainer" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.578955 4713 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": container with ID starting with 141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0 not found: ID does not exist" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.578983 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0"} err="failed to get container status \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": rpc error: code = NotFound desc = could not find container \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": container with ID starting with 141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.579000 4713 scope.go:117] "RemoveContainer" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d" Mar 08 00:19:05 crc kubenswrapper[4713]: E0308 00:19:05.579250 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": container with ID starting with 13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d not found: ID does not exist" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.579276 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d"} err="failed to get container status \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": rpc error: code = NotFound desc = could not find container 
\"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": container with ID starting with 13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.579294 4713 scope.go:117] "RemoveContainer" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.579656 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d"} err="failed to get container status \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": rpc error: code = NotFound desc = could not find container \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": container with ID starting with 824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.579682 4713 scope.go:117] "RemoveContainer" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.580129 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078"} err="failed to get container status \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": rpc error: code = NotFound desc = could not find container \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": container with ID starting with 4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.580163 4713 scope.go:117] "RemoveContainer" containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.580648 4713 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b"} err="failed to get container status \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": rpc error: code = NotFound desc = could not find container \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": container with ID starting with 2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.580681 4713 scope.go:117] "RemoveContainer" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.580986 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864"} err="failed to get container status \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": rpc error: code = NotFound desc = could not find container \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": container with ID starting with 8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.581018 4713 scope.go:117] "RemoveContainer" containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.581350 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93"} err="failed to get container status \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": rpc error: code = NotFound desc = could not find container \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": container with ID starting with 
dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.581382 4713 scope.go:117] "RemoveContainer" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.581647 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855"} err="failed to get container status \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": rpc error: code = NotFound desc = could not find container \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": container with ID starting with b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.581682 4713 scope.go:117] "RemoveContainer" containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582067 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43"} err="failed to get container status \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": rpc error: code = NotFound desc = could not find container \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": container with ID starting with 2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582100 4713 scope.go:117] "RemoveContainer" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582374 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0"} err="failed to get container status \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": rpc error: code = NotFound desc = could not find container \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": container with ID starting with 141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582406 4713 scope.go:117] "RemoveContainer" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582723 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d"} err="failed to get container status \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": rpc error: code = NotFound desc = could not find container \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": container with ID starting with 13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.582750 4713 scope.go:117] "RemoveContainer" containerID="824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583121 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d"} err="failed to get container status \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": rpc error: code = NotFound desc = could not find container \"824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d\": container with ID starting with 824e0153a9b4c4c467bc6d28369cbfcbedd7cca0a24e7311161600accad39f0d not found: ID does not 
exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583149 4713 scope.go:117] "RemoveContainer" containerID="4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583409 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078"} err="failed to get container status \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": rpc error: code = NotFound desc = could not find container \"4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078\": container with ID starting with 4672ca49c4d903a1d7138a8cd9783499f0956065445269d27a76c90897a1d078 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583436 4713 scope.go:117] "RemoveContainer" containerID="2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583696 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b"} err="failed to get container status \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": rpc error: code = NotFound desc = could not find container \"2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b\": container with ID starting with 2ba88d85ef4e18f476899013f7748d639f735986a714d35287373b979ac82a1b not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583719 4713 scope.go:117] "RemoveContainer" containerID="8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583974 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864"} err="failed to get container status 
\"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": rpc error: code = NotFound desc = could not find container \"8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864\": container with ID starting with 8cf44596f570045bddcf1ec0d8929dfd717620344972531f0b8d166140315864 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.583999 4713 scope.go:117] "RemoveContainer" containerID="dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584259 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93"} err="failed to get container status \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": rpc error: code = NotFound desc = could not find container \"dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93\": container with ID starting with dd3f0d485e6e5f097ec471b11f92527ad45d214f5ce6054d90f128ef56d11e93 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584280 4713 scope.go:117] "RemoveContainer" containerID="b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584483 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855"} err="failed to get container status \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": rpc error: code = NotFound desc = could not find container \"b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855\": container with ID starting with b06fefd238fd82bdc3346bea11b852955abb4dc45df725cff9f673ab75dd0855 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584505 4713 scope.go:117] "RemoveContainer" 
containerID="2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584755 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43"} err="failed to get container status \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": rpc error: code = NotFound desc = could not find container \"2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43\": container with ID starting with 2b0d080cad09c742c259267eade0524e0604875a8dc2e86ca9dbd1f38eea1f43 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584780 4713 scope.go:117] "RemoveContainer" containerID="141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.584995 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0"} err="failed to get container status \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": rpc error: code = NotFound desc = could not find container \"141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0\": container with ID starting with 141c4ea251fe9fba4839ca86090006c44a23fe0c0167cfee0995e834e39634d0 not found: ID does not exist" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.585019 4713 scope.go:117] "RemoveContainer" containerID="13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d" Mar 08 00:19:05 crc kubenswrapper[4713]: I0308 00:19:05.585351 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d"} err="failed to get container status \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": rpc error: code = NotFound desc = could 
not find container \"13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d\": container with ID starting with 13edcd5e41775d848681af8502e2bf58944ec4535d09586d8fa3d5327febb09d not found: ID does not exist" Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415433 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"7901a062dea925d54a34042d1f82694290b94ca627c557a0fd9af9e433a01a97"} Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415692 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"69117129efda018065e7176231b21d798a0439111c11a6c53ecae2d7c8adbebe"} Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415704 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"c5746d707fcc3714d3fb41ba9ae86870afb570bb3db246f11b923d439a992674"} Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415715 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"7d35667920095de84c60802ce5f061f2ba8155950a8007ea8212448a4d4368cc"} Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415724 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"3308cba4d6d172163bf7dbe7e2ef98f12fbc51546d7f4a161d8b6e99740e1b2a"} Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.415733 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" 
event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"7cf9283a95da08ae58f85f219e102e1918af08f88130b0effa8d4396cd928086"} Mar 08 00:19:06 crc kubenswrapper[4713]: I0308 00:19:06.547892 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56fbba07-87e8-4e77-b834-ed68af718d11" path="/var/lib/kubelet/pods/56fbba07-87e8-4e77-b834-ed68af718d11/volumes" Mar 08 00:19:08 crc kubenswrapper[4713]: I0308 00:19:08.428559 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"25b677aaa77329ac51c033fd2d56c3625249138ad984ae7e49707909ba0514ca"} Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.443359 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" event={"ID":"1d4b1127-6d10-4c83-b3e9-f588af09812c","Type":"ContainerStarted","Data":"e0ccd78bd4e9bea221c0d60a3b046309bf5139ba8beb597d15579b79e5d4fb16"} Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.443974 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.443993 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.444006 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.466578 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:10 crc kubenswrapper[4713]: I0308 00:19:10.468924 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:10 crc 
kubenswrapper[4713]: I0308 00:19:10.473079 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" podStartSLOduration=6.473061281 podStartE2EDuration="6.473061281s" podCreationTimestamp="2026-03-08 00:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:19:10.471030978 +0000 UTC m=+804.590663231" watchObservedRunningTime="2026-03-08 00:19:10.473061281 +0000 UTC m=+804.592693524" Mar 08 00:19:12 crc kubenswrapper[4713]: I0308 00:19:12.922059 4713 scope.go:117] "RemoveContainer" containerID="a5ad4469ff836c615e5b2bcb96b4fe9efd7c80eb9a37dbbbc54e3aa236361f04" Mar 08 00:19:16 crc kubenswrapper[4713]: I0308 00:19:16.954201 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vh48p" Mar 08 00:19:17 crc kubenswrapper[4713]: I0308 00:19:17.003336 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"] Mar 08 00:19:19 crc kubenswrapper[4713]: I0308 00:19:19.540300 4713 scope.go:117] "RemoveContainer" containerID="393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222" Mar 08 00:19:19 crc kubenswrapper[4713]: E0308 00:19:19.540494 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-fh96f_openshift-multus(bf95e3f7-808b-434f-8fd4-c7e7365a1561)\"" pod="openshift-multus/multus-fh96f" podUID="bf95e3f7-808b-434f-8fd4-c7e7365a1561" Mar 08 00:19:30 crc kubenswrapper[4713]: I0308 00:19:30.540812 4713 scope.go:117] "RemoveContainer" containerID="393edc0643830d2b79626badd9377f827d4c6be3099c83edaa7aaf6132513222" Mar 08 00:19:31 crc kubenswrapper[4713]: I0308 00:19:31.556491 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-fh96f_bf95e3f7-808b-434f-8fd4-c7e7365a1561/kube-multus/2.log" Mar 08 00:19:31 crc kubenswrapper[4713]: I0308 00:19:31.557120 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fh96f" event={"ID":"bf95e3f7-808b-434f-8fd4-c7e7365a1561","Type":"ContainerStarted","Data":"4ba8c147465404e7712fc0edbf400ab1fea985cebc5927beacab6ccd5020b59c"} Mar 08 00:19:34 crc kubenswrapper[4713]: I0308 00:19:34.501166 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:19:34 crc kubenswrapper[4713]: I0308 00:19:34.501235 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:19:34 crc kubenswrapper[4713]: I0308 00:19:34.909724 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8g77c" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.043578 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" podUID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" containerName="registry" containerID="cri-o://93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff" gracePeriod=30 Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.485461 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546672 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546704 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546752 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546881 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546933 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk5fw\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.546962 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.547021 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.547052 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca\") pod \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\" (UID: \"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9\") " Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.547739 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.549173 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.552120 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.552530 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw" (OuterVolumeSpecName: "kube-api-access-gk5fw") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "kube-api-access-gk5fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.554480 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.554870 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.559095 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.562931 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" (UID: "68a8aac8-a3d8-45c3-a4f2-6420f4740ac9"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.610951 4713 generic.go:334] "Generic (PLEG): container finished" podID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" containerID="93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff" exitCode=0 Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.610990 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" event={"ID":"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9","Type":"ContainerDied","Data":"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff"} Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.611015 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" event={"ID":"68a8aac8-a3d8-45c3-a4f2-6420f4740ac9","Type":"ContainerDied","Data":"bb5ac4f2b836df6ac588ac8b2f666d14dde9ba8adb7944edc138fe1ed9464c9d"} Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.611023 4713 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bnx6n" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.611029 4713 scope.go:117] "RemoveContainer" containerID="93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.628697 4713 scope.go:117] "RemoveContainer" containerID="93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff" Mar 08 00:19:42 crc kubenswrapper[4713]: E0308 00:19:42.629318 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff\": container with ID starting with 93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff not found: ID does not exist" containerID="93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.629359 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff"} err="failed to get container status \"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff\": rpc error: code = NotFound desc = could not find container \"93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff\": container with ID starting with 93cc0fcd69abc860cf55312dc82c20ddffc56cc57377b335880d3a97133a4aff not found: ID does not exist" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.639367 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"] Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.643726 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bnx6n"] Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648651 4713 reconciler_common.go:293] "Volume 
detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648683 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648707 4713 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648717 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648730 4713 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648745 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk5fw\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-kube-api-access-gk5fw\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:42 crc kubenswrapper[4713]: I0308 00:19:42.648756 4713 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:44 crc kubenswrapper[4713]: I0308 00:19:44.546961 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" 
path="/var/lib/kubelet/pods/68a8aac8-a3d8-45c3-a4f2-6420f4740ac9/volumes" Mar 08 00:19:47 crc kubenswrapper[4713]: I0308 00:19:47.548151 4713 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.125063 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4m4tz"] Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.125777 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4m4tz" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="registry-server" containerID="cri-o://23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904" gracePeriod=30 Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.489800 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.625441 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content\") pod \"cb44436e-472b-4a5f-8ff6-06242535e835\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.625523 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrdd5\" (UniqueName: \"kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5\") pod \"cb44436e-472b-4a5f-8ff6-06242535e835\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.625549 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities\") pod 
\"cb44436e-472b-4a5f-8ff6-06242535e835\" (UID: \"cb44436e-472b-4a5f-8ff6-06242535e835\") " Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.627138 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities" (OuterVolumeSpecName: "utilities") pod "cb44436e-472b-4a5f-8ff6-06242535e835" (UID: "cb44436e-472b-4a5f-8ff6-06242535e835"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.631356 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5" (OuterVolumeSpecName: "kube-api-access-mrdd5") pod "cb44436e-472b-4a5f-8ff6-06242535e835" (UID: "cb44436e-472b-4a5f-8ff6-06242535e835"). InnerVolumeSpecName "kube-api-access-mrdd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.649465 4713 generic.go:334] "Generic (PLEG): container finished" podID="cb44436e-472b-4a5f-8ff6-06242535e835" containerID="23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904" exitCode=0 Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.649513 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerDied","Data":"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904"} Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.649563 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4m4tz" event={"ID":"cb44436e-472b-4a5f-8ff6-06242535e835","Type":"ContainerDied","Data":"872b442fcf53dc350c20c113c6415793cd135f6045c9203dc5387eb2fa9f45e6"} Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.649580 4713 scope.go:117] "RemoveContainer" 
containerID="23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.649526 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4m4tz" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.652536 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb44436e-472b-4a5f-8ff6-06242535e835" (UID: "cb44436e-472b-4a5f-8ff6-06242535e835"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.665251 4713 scope.go:117] "RemoveContainer" containerID="dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.677704 4713 scope.go:117] "RemoveContainer" containerID="b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.692892 4713 scope.go:117] "RemoveContainer" containerID="23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904" Mar 08 00:19:49 crc kubenswrapper[4713]: E0308 00:19:49.693432 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904\": container with ID starting with 23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904 not found: ID does not exist" containerID="23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.693459 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904"} err="failed to get container status 
\"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904\": rpc error: code = NotFound desc = could not find container \"23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904\": container with ID starting with 23f67ec69a4a599e171c3976b9fd0c7695c610c82963361204cfa2656c4fa904 not found: ID does not exist" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.693479 4713 scope.go:117] "RemoveContainer" containerID="dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87" Mar 08 00:19:49 crc kubenswrapper[4713]: E0308 00:19:49.693849 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87\": container with ID starting with dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87 not found: ID does not exist" containerID="dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.693898 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87"} err="failed to get container status \"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87\": rpc error: code = NotFound desc = could not find container \"dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87\": container with ID starting with dab489fb584fb93c45f36cb3360d36facce6eecc130f0b5f47a63f807f173b87 not found: ID does not exist" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.693929 4713 scope.go:117] "RemoveContainer" containerID="b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6" Mar 08 00:19:49 crc kubenswrapper[4713]: E0308 00:19:49.694209 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6\": container with ID starting with b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6 not found: ID does not exist" containerID="b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.694237 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6"} err="failed to get container status \"b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6\": rpc error: code = NotFound desc = could not find container \"b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6\": container with ID starting with b18b6fc6465b4e2a4cd841bf129ddc17aa0ded5adc8dab1c2e2a29bd980417c6 not found: ID does not exist" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.727445 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.727491 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrdd5\" (UniqueName: \"kubernetes.io/projected/cb44436e-472b-4a5f-8ff6-06242535e835-kube-api-access-mrdd5\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.727504 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb44436e-472b-4a5f-8ff6-06242535e835-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.990295 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4m4tz"] Mar 08 00:19:49 crc kubenswrapper[4713]: I0308 00:19:49.998512 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4m4tz"] Mar 08 00:19:50 crc kubenswrapper[4713]: I0308 00:19:50.548891 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" path="/var/lib/kubelet/pods/cb44436e-472b-4a5f-8ff6-06242535e835/volumes" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680111 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p"] Mar 08 00:19:52 crc kubenswrapper[4713]: E0308 00:19:52.680316 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="extract-utilities" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680339 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="extract-utilities" Mar 08 00:19:52 crc kubenswrapper[4713]: E0308 00:19:52.680353 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" containerName="registry" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680359 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" containerName="registry" Mar 08 00:19:52 crc kubenswrapper[4713]: E0308 00:19:52.680373 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="registry-server" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680379 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="registry-server" Mar 08 00:19:52 crc kubenswrapper[4713]: E0308 00:19:52.680390 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="extract-content" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680398 4713 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="extract-content" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680490 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a8aac8-a3d8-45c3-a4f2-6420f4740ac9" containerName="registry" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.680502 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb44436e-472b-4a5f-8ff6-06242535e835" containerName="registry-server" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.681258 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.684396 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.691664 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p"] Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.860141 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvwt4\" (UniqueName: \"kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.860230 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.860602 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.962181 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.962254 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvwt4\" (UniqueName: \"kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.962303 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: 
I0308 00:19:52.962740 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.962771 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:52 crc kubenswrapper[4713]: I0308 00:19:52.982147 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvwt4\" (UniqueName: \"kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:53 crc kubenswrapper[4713]: I0308 00:19:53.002739 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:53 crc kubenswrapper[4713]: I0308 00:19:53.395450 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p"] Mar 08 00:19:53 crc kubenswrapper[4713]: I0308 00:19:53.670670 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerStarted","Data":"1ce1a1ce20772862ea12be0992aae2cea312d04841ec72c6ac661ab992251963"} Mar 08 00:19:53 crc kubenswrapper[4713]: I0308 00:19:53.671033 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerStarted","Data":"4c86f2a4f6779fa3607ffb13f24034e849d61c6237e1b98867fba5b237c59d0d"} Mar 08 00:19:54 crc kubenswrapper[4713]: I0308 00:19:54.676623 4713 generic.go:334] "Generic (PLEG): container finished" podID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerID="1ce1a1ce20772862ea12be0992aae2cea312d04841ec72c6ac661ab992251963" exitCode=0 Mar 08 00:19:54 crc kubenswrapper[4713]: I0308 00:19:54.676676 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerDied","Data":"1ce1a1ce20772862ea12be0992aae2cea312d04841ec72c6ac661ab992251963"} Mar 08 00:19:55 crc kubenswrapper[4713]: I0308 00:19:55.850436 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"] Mar 08 00:19:55 crc kubenswrapper[4713]: I0308 00:19:55.856073 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:55 crc kubenswrapper[4713]: I0308 00:19:55.856502 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"] Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.001098 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.001203 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.001743 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2btx\" (UniqueName: \"kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.103623 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.103965 4713 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-f2btx\" (UniqueName: \"kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.104241 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.104606 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.105052 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.125028 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2btx\" (UniqueName: \"kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx\") pod \"redhat-operators-z6sch\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.179127 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.377160 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"] Mar 08 00:19:56 crc kubenswrapper[4713]: W0308 00:19:56.378726 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeebc8d8_7e37_468b_a3b9_4ef9e73afb7a.slice/crio-3f5dc039938ae0039619e3673f0d3e74ed91954352f20a12f6e9005ffaa413a3 WatchSource:0}: Error finding container 3f5dc039938ae0039619e3673f0d3e74ed91954352f20a12f6e9005ffaa413a3: Status 404 returned error can't find the container with id 3f5dc039938ae0039619e3673f0d3e74ed91954352f20a12f6e9005ffaa413a3 Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.690320 4713 generic.go:334] "Generic (PLEG): container finished" podID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerID="a736c4ba1de9eee3e4e1fba600b72037c5c4ae6b13a53129cedc82690a0bf9d4" exitCode=0 Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.690534 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerDied","Data":"a736c4ba1de9eee3e4e1fba600b72037c5c4ae6b13a53129cedc82690a0bf9d4"} Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.692735 4713 generic.go:334] "Generic (PLEG): container finished" podID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerID="484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361" exitCode=0 Mar 08 00:19:56 crc kubenswrapper[4713]: I0308 00:19:56.692763 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerDied","Data":"484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361"} Mar 08 00:19:56 crc 
kubenswrapper[4713]: I0308 00:19:56.692781 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerStarted","Data":"3f5dc039938ae0039619e3673f0d3e74ed91954352f20a12f6e9005ffaa413a3"} Mar 08 00:19:57 crc kubenswrapper[4713]: I0308 00:19:57.700675 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerStarted","Data":"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8"} Mar 08 00:19:57 crc kubenswrapper[4713]: I0308 00:19:57.703079 4713 generic.go:334] "Generic (PLEG): container finished" podID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerID="ecc5a233466087ba46cc571d3010af15eff315f61d103d413f967cc98b050e7f" exitCode=0 Mar 08 00:19:57 crc kubenswrapper[4713]: I0308 00:19:57.703115 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerDied","Data":"ecc5a233466087ba46cc571d3010af15eff315f61d103d413f967cc98b050e7f"} Mar 08 00:19:58 crc kubenswrapper[4713]: I0308 00:19:58.709730 4713 generic.go:334] "Generic (PLEG): container finished" podID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerID="8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8" exitCode=0 Mar 08 00:19:58 crc kubenswrapper[4713]: I0308 00:19:58.709860 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerDied","Data":"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8"} Mar 08 00:19:58 crc kubenswrapper[4713]: I0308 00:19:58.923330 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.037601 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util\") pod \"9a95188d-5e62-49d4-851d-08195ed98f4d\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.037677 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvwt4\" (UniqueName: \"kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4\") pod \"9a95188d-5e62-49d4-851d-08195ed98f4d\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.037761 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle\") pod \"9a95188d-5e62-49d4-851d-08195ed98f4d\" (UID: \"9a95188d-5e62-49d4-851d-08195ed98f4d\") " Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.040432 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle" (OuterVolumeSpecName: "bundle") pod "9a95188d-5e62-49d4-851d-08195ed98f4d" (UID: "9a95188d-5e62-49d4-851d-08195ed98f4d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.043982 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4" (OuterVolumeSpecName: "kube-api-access-gvwt4") pod "9a95188d-5e62-49d4-851d-08195ed98f4d" (UID: "9a95188d-5e62-49d4-851d-08195ed98f4d"). InnerVolumeSpecName "kube-api-access-gvwt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.138912 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvwt4\" (UniqueName: \"kubernetes.io/projected/9a95188d-5e62-49d4-851d-08195ed98f4d-kube-api-access-gvwt4\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.138946 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.231591 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util" (OuterVolumeSpecName: "util") pod "9a95188d-5e62-49d4-851d-08195ed98f4d" (UID: "9a95188d-5e62-49d4-851d-08195ed98f4d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.239860 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a95188d-5e62-49d4-851d-08195ed98f4d-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.732065 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.732065 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p" event={"ID":"9a95188d-5e62-49d4-851d-08195ed98f4d","Type":"ContainerDied","Data":"4c86f2a4f6779fa3607ffb13f24034e849d61c6237e1b98867fba5b237c59d0d"} Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.732529 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c86f2a4f6779fa3607ffb13f24034e849d61c6237e1b98867fba5b237c59d0d" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.735152 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerStarted","Data":"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e"} Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.757250 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z6sch" podStartSLOduration=2.36665676 podStartE2EDuration="4.757230795s" podCreationTimestamp="2026-03-08 00:19:55 +0000 UTC" firstStartedPulling="2026-03-08 00:19:56.694166255 +0000 UTC m=+850.813798488" lastFinishedPulling="2026-03-08 00:19:59.08474029 +0000 UTC m=+853.204372523" observedRunningTime="2026-03-08 00:19:59.754281198 +0000 UTC m=+853.873913451" watchObservedRunningTime="2026-03-08 00:19:59.757230795 +0000 UTC m=+853.876863028" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.886194 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"] Mar 08 00:19:59 crc kubenswrapper[4713]: E0308 00:19:59.886411 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="extract" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.886431 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="extract" Mar 08 00:19:59 crc kubenswrapper[4713]: E0308 00:19:59.886449 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="pull" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.886458 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="pull" Mar 08 00:19:59 crc kubenswrapper[4713]: E0308 00:19:59.886472 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="util" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.886481 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="util" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.886618 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a95188d-5e62-49d4-851d-08195ed98f4d" containerName="extract" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.887446 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.889900 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.899633 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"] Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.947331 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:19:59 crc kubenswrapper[4713]: I0308 00:19:59.947482 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.048982 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.049058 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.049096 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gqdk\" (UniqueName: \"kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.049795 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.049929 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.134444 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548820-cts7b"] Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.135609 4713 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.137301 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.137445 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.137842 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.140960 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548820-cts7b"]
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.151426 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gqdk\" (UniqueName: \"kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.152252 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggxxv\" (UniqueName: \"kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv\") pod \"auto-csr-approver-29548820-cts7b\" (UID: \"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085\") " pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.169076 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gqdk\" (UniqueName: \"kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.244176 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.254139 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggxxv\" (UniqueName: \"kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv\") pod \"auto-csr-approver-29548820-cts7b\" (UID: \"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085\") " pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.282475 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggxxv\" (UniqueName: \"kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv\") pod \"auto-csr-approver-29548820-cts7b\" (UID: \"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085\") " pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.426377 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"]
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.452740 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.615565 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548820-cts7b"]
Mar 08 00:20:00 crc kubenswrapper[4713]: W0308 00:20:00.623157 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c62a3d3_0f8a_40d6_a2f0_b860e9c85085.slice/crio-90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9 WatchSource:0}: Error finding container 90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9: Status 404 returned error can't find the container with id 90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.740864 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548820-cts7b" event={"ID":"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085","Type":"ContainerStarted","Data":"90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9"}
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.742557 4713 generic.go:334] "Generic (PLEG): container finished" podID="82947b22-2505-49f0-94e0-039a1a219656" containerID="186c363db3b3f5848bf217802d858e513ea39f3d481d7f645c52991e2dbdc59e" exitCode=0
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.742647 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerDied","Data":"186c363db3b3f5848bf217802d858e513ea39f3d481d7f645c52991e2dbdc59e"}
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.742681 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerStarted","Data":"5863e3fb49cb0abd48c8e4b772dd331e5b10c077db31a598dfa94396300dc6da"}
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.889143 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"]
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.891333 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:00 crc kubenswrapper[4713]: I0308 00:20:00.900612 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"]
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.063168 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.063468 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncqrm\" (UniqueName: \"kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.063526 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.165081 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.165127 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncqrm\" (UniqueName: \"kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.165173 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.165547 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.165669 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.184085 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncqrm\" (UniqueName: \"kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.215649 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.394945 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"]
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.750133 4713 generic.go:334] "Generic (PLEG): container finished" podID="54dbca74-9530-4327-8ede-124dc50096cf" containerID="3fbf74e5fa454b583c7cbbe45cb691fc6bd2392bfaf1d1ffec1a8bc6f6b3cef6" exitCode=0
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.750196 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerDied","Data":"3fbf74e5fa454b583c7cbbe45cb691fc6bd2392bfaf1d1ffec1a8bc6f6b3cef6"}
Mar 08 00:20:01 crc kubenswrapper[4713]: I0308 00:20:01.750519 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerStarted","Data":"808e9e7480a81eee4107c02dd7bdc5469952f29631ae9e10215a3f95deab1629"}
Mar 08 00:20:03 crc kubenswrapper[4713]: I0308 00:20:03.761927 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerStarted","Data":"1efd9ebff4e293a83cbc2d4395c90416eff2427e8ccd4ac0c53f176f5ead001b"}
Mar 08 00:20:03 crc kubenswrapper[4713]: I0308 00:20:03.763866 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548820-cts7b" event={"ID":"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085","Type":"ContainerStarted","Data":"f841e6785162901f02d099ef1f13977229ba672ec5a1c4b87a1f7c3c310267fe"}
Mar 08 00:20:03 crc kubenswrapper[4713]: I0308 00:20:03.765468 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerStarted","Data":"27a8018254bcdc95999268b684b7c8eefdd283d285e194688d9f530abdc16e37"}
Mar 08 00:20:03 crc kubenswrapper[4713]: I0308 00:20:03.803144 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548820-cts7b" podStartSLOduration=1.156346864 podStartE2EDuration="3.803125501s" podCreationTimestamp="2026-03-08 00:20:00 +0000 UTC" firstStartedPulling="2026-03-08 00:20:00.625168307 +0000 UTC m=+854.744800540" lastFinishedPulling="2026-03-08 00:20:03.271946944 +0000 UTC m=+857.391579177" observedRunningTime="2026-03-08 00:20:03.800390179 +0000 UTC m=+857.920022422" watchObservedRunningTime="2026-03-08 00:20:03.803125501 +0000 UTC m=+857.922757744"
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.501518 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.501576 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.501620 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v"
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.502174 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.502228 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4" gracePeriod=600
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.771707 4713 generic.go:334] "Generic (PLEG): container finished" podID="8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" containerID="f841e6785162901f02d099ef1f13977229ba672ec5a1c4b87a1f7c3c310267fe" exitCode=0
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.771760 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548820-cts7b" event={"ID":"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085","Type":"ContainerDied","Data":"f841e6785162901f02d099ef1f13977229ba672ec5a1c4b87a1f7c3c310267fe"}
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.854092 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-75hx9"]
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.855398 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:04 crc kubenswrapper[4713]: I0308 00:20:04.912812 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-75hx9"]
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.009684 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8thgw\" (UniqueName: \"kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.009729 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.010205 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.112345 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8thgw\" (UniqueName: \"kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.112410 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.112443 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.113044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.113563 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.139388 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8thgw\" (UniqueName: \"kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw\") pod \"certified-operators-75hx9\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.249185 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75hx9"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.721018 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-75hx9"]
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.781043 4713 generic.go:334] "Generic (PLEG): container finished" podID="82947b22-2505-49f0-94e0-039a1a219656" containerID="1efd9ebff4e293a83cbc2d4395c90416eff2427e8ccd4ac0c53f176f5ead001b" exitCode=0
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.782149 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerDied","Data":"1efd9ebff4e293a83cbc2d4395c90416eff2427e8ccd4ac0c53f176f5ead001b"}
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.797155 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4" exitCode=0
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.797221 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4"}
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.797255 4713 scope.go:117] "RemoveContainer" containerID="04ebfc2302b56f8bb12a70d64fc021a3b048e8c595c42bd1150e283caea23596"
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.811241 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerStarted","Data":"bbb7c668e198fab933a09095559493804adf46dd60ac7836615cd7c4aef891ab"}
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.827887 4713 generic.go:334] "Generic (PLEG): container finished" podID="54dbca74-9530-4327-8ede-124dc50096cf" containerID="27a8018254bcdc95999268b684b7c8eefdd283d285e194688d9f530abdc16e37" exitCode=0
Mar 08 00:20:05 crc kubenswrapper[4713]: I0308 00:20:05.828073 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerDied","Data":"27a8018254bcdc95999268b684b7c8eefdd283d285e194688d9f530abdc16e37"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.180665 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z6sch"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.180731 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z6sch"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.270191 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.287194 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z6sch"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.426612 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggxxv\" (UniqueName: \"kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv\") pod \"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085\" (UID: \"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085\") "
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.443170 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv" (OuterVolumeSpecName: "kube-api-access-ggxxv") pod "8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" (UID: "8c62a3d3-0f8a-40d6-a2f0-b860e9c85085"). InnerVolumeSpecName "kube-api-access-ggxxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.527615 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggxxv\" (UniqueName: \"kubernetes.io/projected/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085-kube-api-access-ggxxv\") on node \"crc\" DevicePath \"\""
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.835214 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerStarted","Data":"8eeddca99c72da75088d7692b5518a91502014098929856d7dd903dc4f2249d0"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.837624 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.839185 4713 generic.go:334] "Generic (PLEG): container finished" podID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerID="637411a4d2fb86d6c5126e6739d735ba75486124da7b040143ab3e4b7241f16f" exitCode=0
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.839242 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerDied","Data":"637411a4d2fb86d6c5126e6739d735ba75486124da7b040143ab3e4b7241f16f"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.842479 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548820-cts7b" event={"ID":"8c62a3d3-0f8a-40d6-a2f0-b860e9c85085","Type":"ContainerDied","Data":"90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.842511 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548820-cts7b"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.842520 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90b2ee71afbae434b1f5aebbc1de220ec9caec4f6f505e2a5b130e83d7ed85e9"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.848126 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerStarted","Data":"40e345edc7e613ecc12357e373e3bd98dff211d5631b1d57b9dcc6475a9fad5f"}
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.893591 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" podStartSLOduration=5.365456665 podStartE2EDuration="7.893573559s" podCreationTimestamp="2026-03-08 00:19:59 +0000 UTC" firstStartedPulling="2026-03-08 00:20:00.744219521 +0000 UTC m=+854.863851754" lastFinishedPulling="2026-03-08 00:20:03.272336415 +0000 UTC m=+857.391968648" observedRunningTime="2026-03-08 00:20:06.87417442 +0000 UTC m=+860.993806663" watchObservedRunningTime="2026-03-08 00:20:06.893573559 +0000 UTC m=+861.013205792"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.927538 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" podStartSLOduration=5.407772094 podStartE2EDuration="6.927517799s" podCreationTimestamp="2026-03-08 00:20:00 +0000 UTC" firstStartedPulling="2026-03-08 00:20:01.751359357 +0000 UTC m=+855.870991600" lastFinishedPulling="2026-03-08 00:20:03.271105072 +0000 UTC m=+857.390737305" observedRunningTime="2026-03-08 00:20:06.923960716 +0000 UTC m=+861.043592949" watchObservedRunningTime="2026-03-08 00:20:06.927517799 +0000 UTC m=+861.047150022"
Mar 08 00:20:06 crc kubenswrapper[4713]: I0308 00:20:06.935503 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z6sch"
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.365353 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548814-v94cz"]
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.373957 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548814-v94cz"]
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.854940 4713 generic.go:334] "Generic (PLEG): container finished" podID="54dbca74-9530-4327-8ede-124dc50096cf" containerID="40e345edc7e613ecc12357e373e3bd98dff211d5631b1d57b9dcc6475a9fad5f" exitCode=0
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.855254 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerDied","Data":"40e345edc7e613ecc12357e373e3bd98dff211d5631b1d57b9dcc6475a9fad5f"}
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.856989 4713 generic.go:334] "Generic (PLEG): container finished" podID="82947b22-2505-49f0-94e0-039a1a219656" containerID="8eeddca99c72da75088d7692b5518a91502014098929856d7dd903dc4f2249d0" exitCode=0
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.857022 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerDied","Data":"8eeddca99c72da75088d7692b5518a91502014098929856d7dd903dc4f2249d0"}
Mar 08 00:20:07 crc kubenswrapper[4713]: I0308 00:20:07.859696 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerStarted","Data":"203803ad97a614301bd797ddfaef477a72b58ad751b3d2f33a3a8397a7ce8390"}
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.429722 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"]
Mar 08 00:20:08 crc kubenswrapper[4713]: E0308 00:20:08.429937 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" containerName="oc"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.429949 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" containerName="oc"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.430086 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" containerName="oc"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.430759 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.498058 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"]
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.547387 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a8563b5-1794-4b14-b040-5694cafd63e8" path="/var/lib/kubelet/pods/4a8563b5-1794-4b14-b040-5694cafd63e8/volumes"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.551967 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.552050 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2546\" (UniqueName: \"kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.552127 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.653301 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.653358 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2546\" (UniqueName: \"kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.653435 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.653846 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.654559 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.689856 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2546\" (UniqueName: \"kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:08 crc kubenswrapper[4713]: I0308 00:20:08.764190 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.441540 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt"
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.450404 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p"
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571130 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gqdk\" (UniqueName: \"kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk\") pod \"82947b22-2505-49f0-94e0-039a1a219656\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571218 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util\") pod \"54dbca74-9530-4327-8ede-124dc50096cf\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571239 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util\") pod \"82947b22-2505-49f0-94e0-039a1a219656\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571261 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle\") pod \"82947b22-2505-49f0-94e0-039a1a219656\" (UID: \"82947b22-2505-49f0-94e0-039a1a219656\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571304 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncqrm\" (UniqueName: \"kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm\") pod \"54dbca74-9530-4327-8ede-124dc50096cf\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.571340 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle\") pod \"54dbca74-9530-4327-8ede-124dc50096cf\" (UID: \"54dbca74-9530-4327-8ede-124dc50096cf\") "
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.572158 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle" (OuterVolumeSpecName: "bundle") pod "54dbca74-9530-4327-8ede-124dc50096cf" (UID: "54dbca74-9530-4327-8ede-124dc50096cf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.573354 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle" (OuterVolumeSpecName: "bundle") pod "82947b22-2505-49f0-94e0-039a1a219656" (UID: "82947b22-2505-49f0-94e0-039a1a219656"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.579273 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm" (OuterVolumeSpecName: "kube-api-access-ncqrm") pod "54dbca74-9530-4327-8ede-124dc50096cf" (UID: "54dbca74-9530-4327-8ede-124dc50096cf"). InnerVolumeSpecName "kube-api-access-ncqrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.581268 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk" (OuterVolumeSpecName: "kube-api-access-9gqdk") pod "82947b22-2505-49f0-94e0-039a1a219656" (UID: "82947b22-2505-49f0-94e0-039a1a219656"). InnerVolumeSpecName "kube-api-access-9gqdk".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.591957 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util" (OuterVolumeSpecName: "util") pod "54dbca74-9530-4327-8ede-124dc50096cf" (UID: "54dbca74-9530-4327-8ede-124dc50096cf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.593682 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util" (OuterVolumeSpecName: "util") pod "82947b22-2505-49f0-94e0-039a1a219656" (UID: "82947b22-2505-49f0-94e0-039a1a219656"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.616568 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw"] Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672406 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gqdk\" (UniqueName: \"kubernetes.io/projected/82947b22-2505-49f0-94e0-039a1a219656-kube-api-access-9gqdk\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672447 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672460 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672473 4713 reconciler_common.go:293] "Volume detached for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82947b22-2505-49f0-94e0-039a1a219656-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672485 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncqrm\" (UniqueName: \"kubernetes.io/projected/54dbca74-9530-4327-8ede-124dc50096cf-kube-api-access-ncqrm\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.672498 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54dbca74-9530-4327-8ede-124dc50096cf-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.876234 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" event={"ID":"54dbca74-9530-4327-8ede-124dc50096cf","Type":"ContainerDied","Data":"808e9e7480a81eee4107c02dd7bdc5469952f29631ae9e10215a3f95deab1629"} Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.876560 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="808e9e7480a81eee4107c02dd7bdc5469952f29631ae9e10215a3f95deab1629" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.876295 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.878133 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerStarted","Data":"fcd5a63406e47a9ca5e740a3b76dadd13920b5c3ffc7dd0be1ebb90e3737ab3a"} Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.878172 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerStarted","Data":"8ba57064076cfea14f3b28a190f2d539ac83115e86c3be26c27521876412cfae"} Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.880441 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" event={"ID":"82947b22-2505-49f0-94e0-039a1a219656","Type":"ContainerDied","Data":"5863e3fb49cb0abd48c8e4b772dd331e5b10c077db31a598dfa94396300dc6da"} Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.880491 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5863e3fb49cb0abd48c8e4b772dd331e5b10c077db31a598dfa94396300dc6da" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.880491 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p" Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.882568 4713 generic.go:334] "Generic (PLEG): container finished" podID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerID="203803ad97a614301bd797ddfaef477a72b58ad751b3d2f33a3a8397a7ce8390" exitCode=0 Mar 08 00:20:09 crc kubenswrapper[4713]: I0308 00:20:09.882619 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerDied","Data":"203803ad97a614301bd797ddfaef477a72b58ad751b3d2f33a3a8397a7ce8390"} Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.878714 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw"] Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879388 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="pull" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879410 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="pull" Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879432 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="pull" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879440 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="pull" Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879450 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879457 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="54dbca74-9530-4327-8ede-124dc50096cf" 
containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879470 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="util" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879476 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="util" Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879485 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879492 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: E0308 00:20:10.879502 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="util" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879507 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="util" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879598 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="54dbca74-9530-4327-8ede-124dc50096cf" containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.879612 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="82947b22-2505-49f0-94e0-039a1a219656" containerName="extract" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.880145 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.881744 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.882131 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-hxq7b" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.882254 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.889347 4713 generic.go:334] "Generic (PLEG): container finished" podID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerID="fcd5a63406e47a9ca5e740a3b76dadd13920b5c3ffc7dd0be1ebb90e3737ab3a" exitCode=0 Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.889409 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerDied","Data":"fcd5a63406e47a9ca5e740a3b76dadd13920b5c3ffc7dd0be1ebb90e3737ab3a"} Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.889639 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw"] Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.892672 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerStarted","Data":"44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11"} Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.989017 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nvbs\" (UniqueName: 
\"kubernetes.io/projected/1f48c701-2464-42f6-b2d7-c851ae965f1b-kube-api-access-6nvbs\") pod \"obo-prometheus-operator-68bc856cb9-4z5hw\" (UID: \"1f48c701-2464-42f6-b2d7-c851ae965f1b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" Mar 08 00:20:10 crc kubenswrapper[4713]: I0308 00:20:10.993544 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-75hx9" podStartSLOduration=3.353857585 podStartE2EDuration="6.993521132s" podCreationTimestamp="2026-03-08 00:20:04 +0000 UTC" firstStartedPulling="2026-03-08 00:20:06.840454015 +0000 UTC m=+860.960086248" lastFinishedPulling="2026-03-08 00:20:10.480117562 +0000 UTC m=+864.599749795" observedRunningTime="2026-03-08 00:20:10.983934281 +0000 UTC m=+865.103566534" watchObservedRunningTime="2026-03-08 00:20:10.993521132 +0000 UTC m=+865.113153365" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.039792 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.040612 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.044284 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.045081 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qsrk4" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.051721 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.063051 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.063902 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.087606 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.091019 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nvbs\" (UniqueName: \"kubernetes.io/projected/1f48c701-2464-42f6-b2d7-c851ae965f1b-kube-api-access-6nvbs\") pod \"obo-prometheus-operator-68bc856cb9-4z5hw\" (UID: \"1f48c701-2464-42f6-b2d7-c851ae965f1b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.131876 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nvbs\" (UniqueName: \"kubernetes.io/projected/1f48c701-2464-42f6-b2d7-c851ae965f1b-kube-api-access-6nvbs\") pod \"obo-prometheus-operator-68bc856cb9-4z5hw\" (UID: \"1f48c701-2464-42f6-b2d7-c851ae965f1b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.192193 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: \"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.192250 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: 
\"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.192366 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: \"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.192421 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: \"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.197105 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.201722 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v4h4x"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.202726 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.206339 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.206615 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-dswwm" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.246805 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.247045 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z6sch" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="registry-server" containerID="cri-o://6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e" gracePeriod=2 Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.264779 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v4h4x"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.293925 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f559f6d0-89dc-4d38-807f-491671408dc7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.293974 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: 
\"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.294016 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: \"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.294056 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: \"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.294075 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dtm6\" (UniqueName: \"kubernetes.io/projected/f559f6d0-89dc-4d38-807f-491671408dc7-kube-api-access-2dtm6\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.294101 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: \"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 
00:20:11.297578 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: \"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.301405 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2152c14-6da7-4f74-a30e-da9e4e7c1acc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk\" (UID: \"e2152c14-6da7-4f74-a30e-da9e4e7c1acc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.301405 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: \"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.312587 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/860dc604-80d3-4d4b-8b1e-8a430b706882-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5\" (UID: \"860dc604-80d3-4d4b-8b1e-8a430b706882\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.359093 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.386592 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.394784 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f559f6d0-89dc-4d38-807f-491671408dc7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.394890 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dtm6\" (UniqueName: \"kubernetes.io/projected/f559f6d0-89dc-4d38-807f-491671408dc7-kube-api-access-2dtm6\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.401804 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f559f6d0-89dc-4d38-807f-491671408dc7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.414758 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dtm6\" (UniqueName: \"kubernetes.io/projected/f559f6d0-89dc-4d38-807f-491671408dc7-kube-api-access-2dtm6\") pod \"observability-operator-59bdc8b94-v4h4x\" (UID: \"f559f6d0-89dc-4d38-807f-491671408dc7\") " 
pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.438627 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tw72p"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.441995 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.444213 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-vhpxz" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.450655 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tw72p"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.502396 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.566595 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.599084 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d1a0596-7485-4376-9630-688753a7abd7-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.599152 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m68qd\" (UniqueName: \"kubernetes.io/projected/3d1a0596-7485-4376-9630-688753a7abd7-kube-api-access-m68qd\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.699838 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d1a0596-7485-4376-9630-688753a7abd7-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.700185 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m68qd\" (UniqueName: \"kubernetes.io/projected/3d1a0596-7485-4376-9630-688753a7abd7-kube-api-access-m68qd\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.701534 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/3d1a0596-7485-4376-9630-688753a7abd7-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.718206 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.727803 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m68qd\" (UniqueName: \"kubernetes.io/projected/3d1a0596-7485-4376-9630-688753a7abd7-kube-api-access-m68qd\") pod \"perses-operator-5bf474d74f-tw72p\" (UID: \"3d1a0596-7485-4376-9630-688753a7abd7\") " pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.769279 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk"] Mar 08 00:20:11 crc kubenswrapper[4713]: W0308 00:20:11.779684 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2152c14_6da7_4f74_a30e_da9e4e7c1acc.slice/crio-9459eb0cccba4853fe570acaa118e50e3638489bf24957a5208bce1321878180 WatchSource:0}: Error finding container 9459eb0cccba4853fe570acaa118e50e3638489bf24957a5208bce1321878180: Status 404 returned error can't find the container with id 9459eb0cccba4853fe570acaa118e50e3638489bf24957a5208bce1321878180 Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.789023 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.800881 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities\") pod \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.800970 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content\") pod \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.801015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2btx\" (UniqueName: \"kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx\") pod \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\" (UID: \"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a\") " Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.805405 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx" (OuterVolumeSpecName: "kube-api-access-f2btx") pod "deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" (UID: "deebc8d8-7e37-468b-a3b9-4ef9e73afb7a"). InnerVolumeSpecName "kube-api-access-f2btx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.820435 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities" (OuterVolumeSpecName: "utilities") pod "deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" (UID: "deebc8d8-7e37-468b-a3b9-4ef9e73afb7a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.893542 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.900978 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v4h4x"] Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.903601 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.903633 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2btx\" (UniqueName: \"kubernetes.io/projected/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-kube-api-access-f2btx\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:11 crc kubenswrapper[4713]: W0308 00:20:11.910938 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod860dc604_80d3_4d4b_8b1e_8a430b706882.slice/crio-a0f06089fb9523f49bd2ed80109c3c6a23905db25659ad2b7227c376283dc8dd WatchSource:0}: Error finding container a0f06089fb9523f49bd2ed80109c3c6a23905db25659ad2b7227c376283dc8dd: Status 404 returned error can't find the container with id a0f06089fb9523f49bd2ed80109c3c6a23905db25659ad2b7227c376283dc8dd Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.915777 4713 generic.go:334] "Generic (PLEG): container finished" podID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerID="6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e" exitCode=0 Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.915902 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" 
event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerDied","Data":"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e"} Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.915934 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6sch" event={"ID":"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a","Type":"ContainerDied","Data":"3f5dc039938ae0039619e3673f0d3e74ed91954352f20a12f6e9005ffaa413a3"} Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.915957 4713 scope.go:117] "RemoveContainer" containerID="6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.916143 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6sch" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.933583 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" event={"ID":"e2152c14-6da7-4f74-a30e-da9e4e7c1acc","Type":"ContainerStarted","Data":"9459eb0cccba4853fe570acaa118e50e3638489bf24957a5208bce1321878180"} Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.939043 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" event={"ID":"1f48c701-2464-42f6-b2d7-c851ae965f1b","Type":"ContainerStarted","Data":"a399e931a76cf9a50eb6862305dddec7ffe6e8c7ec95be07abefa57e3108aaf6"} Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.954635 4713 scope.go:117] "RemoveContainer" containerID="8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.955008 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" (UID: "deebc8d8-7e37-468b-a3b9-4ef9e73afb7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:11 crc kubenswrapper[4713]: I0308 00:20:11.986907 4713 scope.go:117] "RemoveContainer" containerID="484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361" Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.004368 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.023002 4713 scope.go:117] "RemoveContainer" containerID="6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e" Mar 08 00:20:12 crc kubenswrapper[4713]: E0308 00:20:12.024144 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e\": container with ID starting with 6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e not found: ID does not exist" containerID="6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e" Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.024179 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e"} err="failed to get container status \"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e\": rpc error: code = NotFound desc = could not find container \"6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e\": container with ID starting with 6033e578d21856f494ab38ca348bc0a1d9f9267385dd514c6d3a55d74ab8847e not found: ID does not exist" Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.024204 4713 scope.go:117] "RemoveContainer" 
containerID="8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8" Mar 08 00:20:12 crc kubenswrapper[4713]: E0308 00:20:12.024653 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8\": container with ID starting with 8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8 not found: ID does not exist" containerID="8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8" Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.024678 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8"} err="failed to get container status \"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8\": rpc error: code = NotFound desc = could not find container \"8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8\": container with ID starting with 8bac1d74838606ee4bfa04c4b9838c6c0bf83c1ac059419f147c5375fea2a1d8 not found: ID does not exist" Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.024695 4713 scope.go:117] "RemoveContainer" containerID="484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361" Mar 08 00:20:12 crc kubenswrapper[4713]: E0308 00:20:12.024960 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361\": container with ID starting with 484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361 not found: ID does not exist" containerID="484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361" Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.024980 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361"} err="failed to get container status \"484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361\": rpc error: code = NotFound desc = could not find container \"484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361\": container with ID starting with 484e97f172ed4466c9f0c5c9bef702dc82ce8b64ec4b2a02f887d02e4cd3c361 not found: ID does not exist" Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.072058 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tw72p"] Mar 08 00:20:12 crc kubenswrapper[4713]: W0308 00:20:12.089479 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d1a0596_7485_4376_9630_688753a7abd7.slice/crio-98d4a9fe55a0de57d2470fcfc5445686d40184de1623edfb37b1198353c571a0 WatchSource:0}: Error finding container 98d4a9fe55a0de57d2470fcfc5445686d40184de1623edfb37b1198353c571a0: Status 404 returned error can't find the container with id 98d4a9fe55a0de57d2470fcfc5445686d40184de1623edfb37b1198353c571a0 Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.248271 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"] Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.252685 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z6sch"] Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.565250 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" path="/var/lib/kubelet/pods/deebc8d8-7e37-468b-a3b9-4ef9e73afb7a/volumes" Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.947194 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" 
event={"ID":"3d1a0596-7485-4376-9630-688753a7abd7","Type":"ContainerStarted","Data":"98d4a9fe55a0de57d2470fcfc5445686d40184de1623edfb37b1198353c571a0"} Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.953109 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" event={"ID":"860dc604-80d3-4d4b-8b1e-8a430b706882","Type":"ContainerStarted","Data":"a0f06089fb9523f49bd2ed80109c3c6a23905db25659ad2b7227c376283dc8dd"} Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.954334 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" event={"ID":"f559f6d0-89dc-4d38-807f-491671408dc7","Type":"ContainerStarted","Data":"e8f02f5581939087fbccea10db90ae57c616fe33ee236dcbc9c2e6165ecfd6ff"} Mar 08 00:20:12 crc kubenswrapper[4713]: I0308 00:20:12.963990 4713 scope.go:117] "RemoveContainer" containerID="dfa43747f3bb6e5dbf06700a034e142c0a3b9f2938aaade963ddcb6f4fd3fb53" Mar 08 00:20:15 crc kubenswrapper[4713]: I0308 00:20:15.257603 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-75hx9" Mar 08 00:20:15 crc kubenswrapper[4713]: I0308 00:20:15.258732 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-75hx9" Mar 08 00:20:15 crc kubenswrapper[4713]: I0308 00:20:15.360537 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-75hx9" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.081148 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-75hx9" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.564472 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-59b484cb78-hfzmx"] Mar 08 00:20:16 crc kubenswrapper[4713]: 
E0308 00:20:16.564743 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="extract-content" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.564758 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="extract-content" Mar 08 00:20:16 crc kubenswrapper[4713]: E0308 00:20:16.564781 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="registry-server" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.564789 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="registry-server" Mar 08 00:20:16 crc kubenswrapper[4713]: E0308 00:20:16.564807 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="extract-utilities" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.564816 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="extract-utilities" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.564955 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="deebc8d8-7e37-468b-a3b9-4ef9e73afb7a" containerName="registry-server" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.565436 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.573878 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.574005 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.574076 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.574029 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-f4ckr" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.581699 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-59b484cb78-hfzmx"] Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.675950 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-apiservice-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.675986 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxz9t\" (UniqueName: \"kubernetes.io/projected/e5a74652-f05c-47a0-8caa-77f544c95128-kube-api-access-qxz9t\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.676053 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-webhook-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.777178 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-webhook-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.777305 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-apiservice-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.777333 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxz9t\" (UniqueName: \"kubernetes.io/projected/e5a74652-f05c-47a0-8caa-77f544c95128-kube-api-access-qxz9t\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.786261 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-apiservice-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.797909 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5a74652-f05c-47a0-8caa-77f544c95128-webhook-cert\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.837246 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxz9t\" (UniqueName: \"kubernetes.io/projected/e5a74652-f05c-47a0-8caa-77f544c95128-kube-api-access-qxz9t\") pod \"elastic-operator-59b484cb78-hfzmx\" (UID: \"e5a74652-f05c-47a0-8caa-77f544c95128\") " pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" Mar 08 00:20:16 crc kubenswrapper[4713]: I0308 00:20:16.891813 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" Mar 08 00:20:17 crc kubenswrapper[4713]: I0308 00:20:17.032739 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-75hx9"] Mar 08 00:20:19 crc kubenswrapper[4713]: I0308 00:20:19.008259 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-75hx9" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="registry-server" containerID="cri-o://44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" gracePeriod=2 Mar 08 00:20:20 crc kubenswrapper[4713]: I0308 00:20:20.017176 4713 generic.go:334] "Generic (PLEG): container finished" podID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerID="44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" exitCode=0 Mar 08 00:20:20 crc kubenswrapper[4713]: I0308 00:20:20.017255 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" 
event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerDied","Data":"44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11"} Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.010027 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-qt8kz"] Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.010717 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz" Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.012465 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-pcr9h" Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.020743 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-qt8kz"] Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.150860 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s52x9\" (UniqueName: \"kubernetes.io/projected/37b64282-4957-4a04-b1be-6d3184bfdd25-kube-api-access-s52x9\") pod \"interconnect-operator-5bb49f789d-qt8kz\" (UID: \"37b64282-4957-4a04-b1be-6d3184bfdd25\") " pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz" Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.251627 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s52x9\" (UniqueName: \"kubernetes.io/projected/37b64282-4957-4a04-b1be-6d3184bfdd25-kube-api-access-s52x9\") pod \"interconnect-operator-5bb49f789d-qt8kz\" (UID: \"37b64282-4957-4a04-b1be-6d3184bfdd25\") " pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz" Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.273903 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s52x9\" (UniqueName: 
\"kubernetes.io/projected/37b64282-4957-4a04-b1be-6d3184bfdd25-kube-api-access-s52x9\") pod \"interconnect-operator-5bb49f789d-qt8kz\" (UID: \"37b64282-4957-4a04-b1be-6d3184bfdd25\") " pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz" Mar 08 00:20:21 crc kubenswrapper[4713]: I0308 00:20:21.325989 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz" Mar 08 00:20:25 crc kubenswrapper[4713]: E0308 00:20:25.250234 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11 is running failed: container process not found" containerID="44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 00:20:25 crc kubenswrapper[4713]: E0308 00:20:25.251444 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11 is running failed: container process not found" containerID="44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 00:20:25 crc kubenswrapper[4713]: E0308 00:20:25.251741 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11 is running failed: container process not found" containerID="44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 00:20:25 crc kubenswrapper[4713]: E0308 00:20:25.251770 4713 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-75hx9" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="registry-server" Mar 08 00:20:27 crc kubenswrapper[4713]: E0308 00:20:27.976575 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8" Mar 08 00:20:27 crc kubenswrapper[4713]: E0308 00:20:27.977051 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m68qd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5bf474d74f-tw72p_openshift-operators(3d1a0596-7485-4376-9630-688753a7abd7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:20:27 crc kubenswrapper[4713]: E0308 00:20:27.978228 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" podUID="3d1a0596-7485-4376-9630-688753a7abd7" Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.091008 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling 
image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8\\\"\"" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" podUID="3d1a0596-7485-4376-9630-688753a7abd7" Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.554693 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.555149 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5_openshift-operators(860dc604-80d3-4d4b-8b1e-8a430b706882): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.556303 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" podUID="860dc604-80d3-4d4b-8b1e-8a430b706882" Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.567975 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.568138 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk_openshift-operators(e2152c14-6da7-4f74-a30e-da9e4e7c1acc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 00:20:28 crc kubenswrapper[4713]: E0308 00:20:28.570434 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" podUID="e2152c14-6da7-4f74-a30e-da9e4e7c1acc" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.603492 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-75hx9" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.745421 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities\") pod \"d36584b2-9533-4c0e-807f-247e1dbfde71\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.745790 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8thgw\" (UniqueName: \"kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw\") pod \"d36584b2-9533-4c0e-807f-247e1dbfde71\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.745860 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content\") pod \"d36584b2-9533-4c0e-807f-247e1dbfde71\" (UID: \"d36584b2-9533-4c0e-807f-247e1dbfde71\") " Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.746424 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities" (OuterVolumeSpecName: "utilities") pod "d36584b2-9533-4c0e-807f-247e1dbfde71" (UID: "d36584b2-9533-4c0e-807f-247e1dbfde71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.752624 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw" (OuterVolumeSpecName: "kube-api-access-8thgw") pod "d36584b2-9533-4c0e-807f-247e1dbfde71" (UID: "d36584b2-9533-4c0e-807f-247e1dbfde71"). InnerVolumeSpecName "kube-api-access-8thgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.799548 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d36584b2-9533-4c0e-807f-247e1dbfde71" (UID: "d36584b2-9533-4c0e-807f-247e1dbfde71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.848233 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8thgw\" (UniqueName: \"kubernetes.io/projected/d36584b2-9533-4c0e-807f-247e1dbfde71-kube-api-access-8thgw\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.848266 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.848280 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36584b2-9533-4c0e-807f-247e1dbfde71-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.865246 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-59b484cb78-hfzmx"] Mar 08 00:20:28 crc kubenswrapper[4713]: W0308 00:20:28.873963 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5a74652_f05c_47a0_8caa_77f544c95128.slice/crio-0fad8a1842d72381e5e630976654126a4888cc95d05ca97fff7ef39fa249bc7e WatchSource:0}: Error finding container 0fad8a1842d72381e5e630976654126a4888cc95d05ca97fff7ef39fa249bc7e: Status 404 returned error can't find the container with id 
0fad8a1842d72381e5e630976654126a4888cc95d05ca97fff7ef39fa249bc7e Mar 08 00:20:28 crc kubenswrapper[4713]: I0308 00:20:28.920935 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-qt8kz"] Mar 08 00:20:28 crc kubenswrapper[4713]: W0308 00:20:28.978880 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b64282_4957_4a04_b1be_6d3184bfdd25.slice/crio-1f8559bfcf41ce6c9de748ed8ebabaf9a5d4a53f68c1c21081f7413cdfa273ed WatchSource:0}: Error finding container 1f8559bfcf41ce6c9de748ed8ebabaf9a5d4a53f68c1c21081f7413cdfa273ed: Status 404 returned error can't find the container with id 1f8559bfcf41ce6c9de748ed8ebabaf9a5d4a53f68c1c21081f7413cdfa273ed Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.089808 4713 generic.go:334] "Generic (PLEG): container finished" podID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerID="e17c10baaae1bb1ca1dbac2e430fa8d136422f6db5c8355746dc1a5178fc022a" exitCode=0 Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.090109 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerDied","Data":"e17c10baaae1bb1ca1dbac2e430fa8d136422f6db5c8355746dc1a5178fc022a"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.092986 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" event={"ID":"e5a74652-f05c-47a0-8caa-77f544c95128","Type":"ContainerStarted","Data":"0fad8a1842d72381e5e630976654126a4888cc95d05ca97fff7ef39fa249bc7e"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.095421 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75hx9" 
event={"ID":"d36584b2-9533-4c0e-807f-247e1dbfde71","Type":"ContainerDied","Data":"bbb7c668e198fab933a09095559493804adf46dd60ac7836615cd7c4aef891ab"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.095464 4713 scope.go:117] "RemoveContainer" containerID="44ec152dd3b5386afab48ac8b39a7d3e0f2f0d40c6f319d2c38fe0147e42cf11" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.095562 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75hx9" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.106737 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" event={"ID":"1f48c701-2464-42f6-b2d7-c851ae965f1b","Type":"ContainerStarted","Data":"84eb15f0c45fbb8d9e26d8f14a0b23a2410d8eb034e181470055fc7cad692b3e"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.116304 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" event={"ID":"f559f6d0-89dc-4d38-807f-491671408dc7","Type":"ContainerStarted","Data":"e3fb6bbf96dfb0a98be7f50a8fa5aef291423b0bdc000b4e2acd8e1d1996ef3d"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.123265 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.139447 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz" event={"ID":"37b64282-4957-4a04-b1be-6d3184bfdd25","Type":"ContainerStarted","Data":"1f8559bfcf41ce6c9de748ed8ebabaf9a5d4a53f68c1c21081f7413cdfa273ed"} Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.141891 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-75hx9"] Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.145657 4713 scope.go:117] 
"RemoveContainer" containerID="203803ad97a614301bd797ddfaef477a72b58ad751b3d2f33a3a8397a7ce8390" Mar 08 00:20:29 crc kubenswrapper[4713]: E0308 00:20:29.145996 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" podUID="e2152c14-6da7-4f74-a30e-da9e4e7c1acc" Mar 08 00:20:29 crc kubenswrapper[4713]: E0308 00:20:29.146138 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" podUID="860dc604-80d3-4d4b-8b1e-8a430b706882" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.150445 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-75hx9"] Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.183310 4713 scope.go:117] "RemoveContainer" containerID="637411a4d2fb86d6c5126e6739d735ba75486124da7b040143ab3e4b7241f16f" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.188426 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.193924 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4z5hw" 
podStartSLOduration=2.135290849 podStartE2EDuration="19.193905875s" podCreationTimestamp="2026-03-08 00:20:10 +0000 UTC" firstStartedPulling="2026-03-08 00:20:11.518512217 +0000 UTC m=+865.638144450" lastFinishedPulling="2026-03-08 00:20:28.577127243 +0000 UTC m=+882.696759476" observedRunningTime="2026-03-08 00:20:29.159246786 +0000 UTC m=+883.278879019" watchObservedRunningTime="2026-03-08 00:20:29.193905875 +0000 UTC m=+883.313538108" Mar 08 00:20:29 crc kubenswrapper[4713]: I0308 00:20:29.194342 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-v4h4x" podStartSLOduration=1.526992467 podStartE2EDuration="18.194336947s" podCreationTimestamp="2026-03-08 00:20:11 +0000 UTC" firstStartedPulling="2026-03-08 00:20:11.936535595 +0000 UTC m=+866.056167828" lastFinishedPulling="2026-03-08 00:20:28.603880075 +0000 UTC m=+882.723512308" observedRunningTime="2026-03-08 00:20:29.192131969 +0000 UTC m=+883.311764202" watchObservedRunningTime="2026-03-08 00:20:29.194336947 +0000 UTC m=+883.313969180" Mar 08 00:20:30 crc kubenswrapper[4713]: I0308 00:20:30.149411 4713 generic.go:334] "Generic (PLEG): container finished" podID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerID="2b71c90d3947e985d3c60cc0dd27d2933e68e61be980f33dcd93b7b2ed195658" exitCode=0 Mar 08 00:20:30 crc kubenswrapper[4713]: I0308 00:20:30.150696 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerDied","Data":"2b71c90d3947e985d3c60cc0dd27d2933e68e61be980f33dcd93b7b2ed195658"} Mar 08 00:20:30 crc kubenswrapper[4713]: I0308 00:20:30.550291 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" path="/var/lib/kubelet/pods/d36584b2-9533-4c0e-807f-247e1dbfde71/volumes" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 
00:20:32.650791 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.810204 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util\") pod \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.810244 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle\") pod \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.810267 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2546\" (UniqueName: \"kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546\") pod \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\" (UID: \"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2\") " Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.811613 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle" (OuterVolumeSpecName: "bundle") pod "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" (UID: "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.831923 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546" (OuterVolumeSpecName: "kube-api-access-t2546") pod "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" (UID: "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2"). InnerVolumeSpecName "kube-api-access-t2546". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.837516 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util" (OuterVolumeSpecName: "util") pod "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" (UID: "f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.911491 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.911788 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:32 crc kubenswrapper[4713]: I0308 00:20:32.911798 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2546\" (UniqueName: \"kubernetes.io/projected/f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2-kube-api-access-t2546\") on node \"crc\" DevicePath \"\"" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.174269 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" 
event={"ID":"f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2","Type":"ContainerDied","Data":"8ba57064076cfea14f3b28a190f2d539ac83115e86c3be26c27521876412cfae"} Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.174532 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ba57064076cfea14f3b28a190f2d539ac83115e86c3be26c27521876412cfae" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.174322 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.178272 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" event={"ID":"e5a74652-f05c-47a0-8caa-77f544c95128","Type":"ContainerStarted","Data":"8dda70cd868de7fdd312496823ff4def57393e55badbe07eb7298f321989ba0d"} Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.199075 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-59b484cb78-hfzmx" podStartSLOduration=13.422078622 podStartE2EDuration="17.199055273s" podCreationTimestamp="2026-03-08 00:20:16 +0000 UTC" firstStartedPulling="2026-03-08 00:20:28.877415981 +0000 UTC m=+882.997048214" lastFinishedPulling="2026-03-08 00:20:32.654392632 +0000 UTC m=+886.774024865" observedRunningTime="2026-03-08 00:20:33.19589595 +0000 UTC m=+887.315528183" watchObservedRunningTime="2026-03-08 00:20:33.199055273 +0000 UTC m=+887.318687516" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973183 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973394 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="extract" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973405 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="extract" Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973415 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="extract-utilities" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973421 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="extract-utilities" Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973430 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="registry-server" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973437 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="registry-server" Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973446 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="util" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973452 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="util" Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973463 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="pull" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973470 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="pull" Mar 08 00:20:33 crc kubenswrapper[4713]: E0308 00:20:33.973478 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="extract-content" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973483 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="extract-content" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973572 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36584b2-9533-4c0e-807f-247e1dbfde71" containerName="registry-server" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.973581 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2" containerName="extract" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.974301 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.976114 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-wwn4w" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.980496 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.980545 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.980505 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.980511 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.981248 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.981305 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 08 00:20:33 crc 
kubenswrapper[4713]: I0308 00:20:33.982099 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.982295 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 08 00:20:33 crc kubenswrapper[4713]: I0308 00:20:33.994461 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135226 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135291 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135323 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135509 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135591 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135629 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135647 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135700 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: 
\"kubernetes.io/downward-api/c8a16625-a3a9-4404-bf4a-073fc8f621b9-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135720 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135817 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135872 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135894 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc 
kubenswrapper[4713]: I0308 00:20:34.135915 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.135947 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.136019 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236759 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236859 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236898 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236922 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236947 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/c8a16625-a3a9-4404-bf4a-073fc8f621b9-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236967 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.236998 4713 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237021 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237042 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237065 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237093 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: 
I0308 00:20:34.237138 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237200 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237225 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237248 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.237717 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: 
\"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.240651 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.240936 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.246454 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.246873 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.247025 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.247281 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.247367 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.247393 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.250027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" 
Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.250531 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.250580 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/c8a16625-a3a9-4404-bf4a-073fc8f621b9-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.253274 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.253776 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.254042 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/c8a16625-a3a9-4404-bf4a-073fc8f621b9-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: 
\"c8a16625-a3a9-4404-bf4a-073fc8f621b9\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:34 crc kubenswrapper[4713]: I0308 00:20:34.299137 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:20:38 crc kubenswrapper[4713]: I0308 00:20:38.930066 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 08 00:20:38 crc kubenswrapper[4713]: W0308 00:20:38.942983 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8a16625_a3a9_4404_bf4a_073fc8f621b9.slice/crio-48ef7464e8f3092b00420033b31ae61f9f5d49db06a3aa75a85c9de43703e5d4 WatchSource:0}: Error finding container 48ef7464e8f3092b00420033b31ae61f9f5d49db06a3aa75a85c9de43703e5d4: Status 404 returned error can't find the container with id 48ef7464e8f3092b00420033b31ae61f9f5d49db06a3aa75a85c9de43703e5d4 Mar 08 00:20:39 crc kubenswrapper[4713]: I0308 00:20:39.223855 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz" event={"ID":"37b64282-4957-4a04-b1be-6d3184bfdd25","Type":"ContainerStarted","Data":"50b196c3829ced0df677ebe67124d31538db78b4f84c804a90e8deea5e0d9c5b"} Mar 08 00:20:39 crc kubenswrapper[4713]: I0308 00:20:39.224993 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"c8a16625-a3a9-4404-bf4a-073fc8f621b9","Type":"ContainerStarted","Data":"48ef7464e8f3092b00420033b31ae61f9f5d49db06a3aa75a85c9de43703e5d4"} Mar 08 00:20:39 crc kubenswrapper[4713]: I0308 00:20:39.238393 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-qt8kz" podStartSLOduration=9.471073908 podStartE2EDuration="19.238371073s" podCreationTimestamp="2026-03-08 00:20:20 +0000 UTC" firstStartedPulling="2026-03-08 
00:20:28.982448707 +0000 UTC m=+883.102080940" lastFinishedPulling="2026-03-08 00:20:38.749745882 +0000 UTC m=+892.869378105" observedRunningTime="2026-03-08 00:20:39.237378427 +0000 UTC m=+893.357010680" watchObservedRunningTime="2026-03-08 00:20:39.238371073 +0000 UTC m=+893.358003306" Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.262977 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" event={"ID":"3d1a0596-7485-4376-9630-688753a7abd7","Type":"ContainerStarted","Data":"75b05d212251dce244b0f532ea49ccbbbd16f64f697b14cfa9946e1746a8a25d"} Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.263772 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.264951 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" event={"ID":"860dc604-80d3-4d4b-8b1e-8a430b706882","Type":"ContainerStarted","Data":"740c8696ab496cee743dc6e46525ad94d869dbfa12fcc83ef80cc86a356fd848"} Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.266357 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" event={"ID":"e2152c14-6da7-4f74-a30e-da9e4e7c1acc","Type":"ContainerStarted","Data":"338535d61378695b0d9a4695cbc8f430a2abbfb499530b1ae6060fd07be1faa0"} Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.279607 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" podStartSLOduration=2.065402395 podStartE2EDuration="33.279592184s" podCreationTimestamp="2026-03-08 00:20:11 +0000 UTC" firstStartedPulling="2026-03-08 00:20:12.092124148 +0000 UTC m=+866.211756381" lastFinishedPulling="2026-03-08 00:20:43.306313937 +0000 UTC m=+897.425946170" 
observedRunningTime="2026-03-08 00:20:44.2779225 +0000 UTC m=+898.397554753" watchObservedRunningTime="2026-03-08 00:20:44.279592184 +0000 UTC m=+898.399224417" Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.296976 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5" podStartSLOduration=1.9139403910000001 podStartE2EDuration="33.29695833s" podCreationTimestamp="2026-03-08 00:20:11 +0000 UTC" firstStartedPulling="2026-03-08 00:20:11.923677158 +0000 UTC m=+866.043309391" lastFinishedPulling="2026-03-08 00:20:43.306695097 +0000 UTC m=+897.426327330" observedRunningTime="2026-03-08 00:20:44.296494228 +0000 UTC m=+898.416126481" watchObservedRunningTime="2026-03-08 00:20:44.29695833 +0000 UTC m=+898.416590563" Mar 08 00:20:44 crc kubenswrapper[4713]: I0308 00:20:44.314809 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk" podStartSLOduration=1.789947508 podStartE2EDuration="33.314790528s" podCreationTimestamp="2026-03-08 00:20:11 +0000 UTC" firstStartedPulling="2026-03-08 00:20:11.781523048 +0000 UTC m=+865.901155291" lastFinishedPulling="2026-03-08 00:20:43.306366078 +0000 UTC m=+897.425998311" observedRunningTime="2026-03-08 00:20:44.313045932 +0000 UTC m=+898.432678195" watchObservedRunningTime="2026-03-08 00:20:44.314790528 +0000 UTC m=+898.434422771" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.254234 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn"] Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.255681 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.257748 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.257879 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-wthx6" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.262051 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.278772 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn"] Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.371463 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.371632 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts2p5\" (UniqueName: \"kubernetes.io/projected/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-kube-api-access-ts2p5\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.473549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ts2p5\" (UniqueName: \"kubernetes.io/projected/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-kube-api-access-ts2p5\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.473629 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.474182 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.494545 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts2p5\" (UniqueName: \"kubernetes.io/projected/be3714bd-7a55-41dd-8a2f-1013ca3fff6a-kube-api-access-ts2p5\") pod \"cert-manager-operator-controller-manager-5586865c96-xl6tn\" (UID: \"be3714bd-7a55-41dd-8a2f-1013ca3fff6a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:49 crc kubenswrapper[4713]: I0308 00:20:49.576788 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" Mar 08 00:20:51 crc kubenswrapper[4713]: I0308 00:20:51.804548 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-tw72p" Mar 08 00:20:52 crc kubenswrapper[4713]: I0308 00:20:52.239204 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn"] Mar 08 00:20:52 crc kubenswrapper[4713]: W0308 00:20:52.252787 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe3714bd_7a55_41dd_8a2f_1013ca3fff6a.slice/crio-5ab1bcefea26c876d313f360a68e2c81feecce84f5a96dce82c79dbf53300264 WatchSource:0}: Error finding container 5ab1bcefea26c876d313f360a68e2c81feecce84f5a96dce82c79dbf53300264: Status 404 returned error can't find the container with id 5ab1bcefea26c876d313f360a68e2c81feecce84f5a96dce82c79dbf53300264 Mar 08 00:20:52 crc kubenswrapper[4713]: I0308 00:20:52.314593 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" event={"ID":"be3714bd-7a55-41dd-8a2f-1013ca3fff6a","Type":"ContainerStarted","Data":"5ab1bcefea26c876d313f360a68e2c81feecce84f5a96dce82c79dbf53300264"} Mar 08 00:20:57 crc kubenswrapper[4713]: I0308 00:20:57.342922 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"c8a16625-a3a9-4404-bf4a-073fc8f621b9","Type":"ContainerStarted","Data":"901e49eb0ba165fd48834b3d45c1f58767321c8bc2ebd26b8af80bdbc98d5cbb"} Mar 08 00:20:58 crc kubenswrapper[4713]: I0308 00:20:58.755146 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 08 00:20:58 crc kubenswrapper[4713]: I0308 00:20:58.792948 4713 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 08 00:20:59 crc kubenswrapper[4713]: I0308 00:20:59.563620 4713 generic.go:334] "Generic (PLEG): container finished" podID="c8a16625-a3a9-4404-bf4a-073fc8f621b9" containerID="901e49eb0ba165fd48834b3d45c1f58767321c8bc2ebd26b8af80bdbc98d5cbb" exitCode=0 Mar 08 00:20:59 crc kubenswrapper[4713]: I0308 00:20:59.563673 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"c8a16625-a3a9-4404-bf4a-073fc8f621b9","Type":"ContainerDied","Data":"901e49eb0ba165fd48834b3d45c1f58767321c8bc2ebd26b8af80bdbc98d5cbb"} Mar 08 00:21:00 crc kubenswrapper[4713]: I0308 00:21:00.572072 4713 generic.go:334] "Generic (PLEG): container finished" podID="c8a16625-a3a9-4404-bf4a-073fc8f621b9" containerID="3a7dfc251b473197605f0fe9b0475cb4008cde89b99676c28a3f703bf1f74390" exitCode=0 Mar 08 00:21:00 crc kubenswrapper[4713]: I0308 00:21:00.572175 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"c8a16625-a3a9-4404-bf4a-073fc8f621b9","Type":"ContainerDied","Data":"3a7dfc251b473197605f0fe9b0475cb4008cde89b99676c28a3f703bf1f74390"} Mar 08 00:21:01 crc kubenswrapper[4713]: I0308 00:21:01.592627 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"c8a16625-a3a9-4404-bf4a-073fc8f621b9","Type":"ContainerStarted","Data":"074d80db31ea7158b61840a702a227c2c3efb176013854e00e182d894fe5c6c2"} Mar 08 00:21:01 crc kubenswrapper[4713]: I0308 00:21:01.593255 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:21:01 crc kubenswrapper[4713]: I0308 00:21:01.595375 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" 
event={"ID":"be3714bd-7a55-41dd-8a2f-1013ca3fff6a","Type":"ContainerStarted","Data":"d8664930ddd47f528e3611ce57e6e1187ebfc3923411e19dbebd455e67cae667"} Mar 08 00:21:01 crc kubenswrapper[4713]: I0308 00:21:01.627160 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=15.44471751 podStartE2EDuration="28.62714155s" podCreationTimestamp="2026-03-08 00:20:33 +0000 UTC" firstStartedPulling="2026-03-08 00:20:38.948774935 +0000 UTC m=+893.068407168" lastFinishedPulling="2026-03-08 00:20:52.131198975 +0000 UTC m=+906.250831208" observedRunningTime="2026-03-08 00:21:01.624771328 +0000 UTC m=+915.744403581" watchObservedRunningTime="2026-03-08 00:21:01.62714155 +0000 UTC m=+915.746773803" Mar 08 00:21:01 crc kubenswrapper[4713]: I0308 00:21:01.655560 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-xl6tn" podStartSLOduration=3.828732017 podStartE2EDuration="12.655529725s" podCreationTimestamp="2026-03-08 00:20:49 +0000 UTC" firstStartedPulling="2026-03-08 00:20:52.254387377 +0000 UTC m=+906.374019610" lastFinishedPulling="2026-03-08 00:21:01.081185085 +0000 UTC m=+915.200817318" observedRunningTime="2026-03-08 00:21:01.64885778 +0000 UTC m=+915.768490013" watchObservedRunningTime="2026-03-08 00:21:01.655529725 +0000 UTC m=+915.775161968" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.451429 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.453658 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.455634 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.455897 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.456995 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.457074 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.478249 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.546568 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.546628 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.546650 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jtf4\" (UniqueName: \"kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.546683 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.546698 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547071 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547214 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root\") pod 
\"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547294 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547671 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547810 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547974 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.547999 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649089 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649161 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649188 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649224 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649250 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649273 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jtf4\" (UniqueName: \"kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649294 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649313 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649341 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles\") pod 
\"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649365 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649388 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649421 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.649730 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.650163 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.650381 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.650780 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.651673 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.652144 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.652394 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.652692 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.653067 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.659022 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.659408 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: 
I0308 00:21:06.675479 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jtf4\" (UniqueName: \"kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4\") pod \"service-telemetry-operator-1-build\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:06 crc kubenswrapper[4713]: I0308 00:21:06.778012 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.160234 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.165459 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.480626 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-9mcfp"] Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.481398 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.485673 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.486395 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.486589 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-sb25l" Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.498618 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-9mcfp"] Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.563387 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lchml\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-kube-api-access-lchml\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.563458 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.634128 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"829dcde5-b1d3-4479-875b-6275ec772c1d","Type":"ContainerStarted","Data":"2231a649a8f84f0f717170316c535043ecf640c8208085f92f8a7e585f35d9d1"} 
Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.664684 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lchml\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-kube-api-access-lchml\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.664735 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.683349 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lchml\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-kube-api-access-lchml\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.687355 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a191145-c818-4e84-8bf3-91145fe9db03-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-9mcfp\" (UID: \"1a191145-c818-4e84-8bf3-91145fe9db03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" Mar 08 00:21:07 crc kubenswrapper[4713]: I0308 00:21:07.799116 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" Mar 08 00:21:08 crc kubenswrapper[4713]: I0308 00:21:08.254101 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-9mcfp"] Mar 08 00:21:08 crc kubenswrapper[4713]: I0308 00:21:08.642632 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" event={"ID":"1a191145-c818-4e84-8bf3-91145fe9db03","Type":"ContainerStarted","Data":"64a3c67f8f367e00538fbc4bcc945467433c77daaafe440ce0485464a1bbbf12"} Mar 08 00:21:10 crc kubenswrapper[4713]: I0308 00:21:10.903729 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qmcpl"] Mar 08 00:21:10 crc kubenswrapper[4713]: I0308 00:21:10.904404 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" Mar 08 00:21:10 crc kubenswrapper[4713]: I0308 00:21:10.907438 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pdqsn" Mar 08 00:21:10 crc kubenswrapper[4713]: I0308 00:21:10.916810 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qmcpl"] Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.016727 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g78s\" (UniqueName: \"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-kube-api-access-6g78s\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.016790 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.117694 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.117840 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g78s\" (UniqueName: \"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-kube-api-access-6g78s\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.143810 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.143880 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g78s\" (UniqueName: \"kubernetes.io/projected/2a071bf2-22e7-40f7-976a-74f79abbbd78-kube-api-access-6g78s\") pod \"cert-manager-webhook-6888856db4-qmcpl\" (UID: \"2a071bf2-22e7-40f7-976a-74f79abbbd78\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" Mar 08 00:21:11 crc kubenswrapper[4713]: I0308 00:21:11.230139 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" Mar 08 00:21:14 crc kubenswrapper[4713]: I0308 00:21:14.399112 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="c8a16625-a3a9-4404-bf4a-073fc8f621b9" containerName="elasticsearch" probeResult="failure" output=< Mar 08 00:21:14 crc kubenswrapper[4713]: {"timestamp": "2026-03-08T00:21:14+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 08 00:21:14 crc kubenswrapper[4713]: > Mar 08 00:21:16 crc kubenswrapper[4713]: I0308 00:21:16.654456 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.281867 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qmcpl"] Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.702299 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" event={"ID":"2a071bf2-22e7-40f7-976a-74f79abbbd78","Type":"ContainerStarted","Data":"544dd268fbdb1341d1113d889e9e47f46a4986daed46e78cb9d79bc6f7fd4949"} Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.702375 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" event={"ID":"2a071bf2-22e7-40f7-976a-74f79abbbd78","Type":"ContainerStarted","Data":"2aa4bbe1ec078ce07bf5c1541fe1d0938e383834a58549e3503b572fe162e4bb"} Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.704161 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.705965 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" 
event={"ID":"1a191145-c818-4e84-8bf3-91145fe9db03","Type":"ContainerStarted","Data":"e09e670039fc04d892e0967e16bcd71371700f4ce93a08c3cbdf3ba7af290f46"} Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.710497 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"829dcde5-b1d3-4479-875b-6275ec772c1d","Type":"ContainerStarted","Data":"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e"} Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.710652 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="829dcde5-b1d3-4479-875b-6275ec772c1d" containerName="manage-dockerfile" containerID="cri-o://811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e" gracePeriod=30 Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.725930 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" podStartSLOduration=7.725903971 podStartE2EDuration="7.725903971s" podCreationTimestamp="2026-03-08 00:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:17.724932316 +0000 UTC m=+931.844564549" watchObservedRunningTime="2026-03-08 00:21:17.725903971 +0000 UTC m=+931.845536214" Mar 08 00:21:17 crc kubenswrapper[4713]: I0308 00:21:17.748860 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-9mcfp" podStartSLOduration=2.169245771 podStartE2EDuration="10.748810202s" podCreationTimestamp="2026-03-08 00:21:07 +0000 UTC" firstStartedPulling="2026-03-08 00:21:08.268527358 +0000 UTC m=+922.388159591" lastFinishedPulling="2026-03-08 00:21:16.848091789 +0000 UTC m=+930.967724022" observedRunningTime="2026-03-08 00:21:17.745272879 +0000 UTC m=+931.864905132" 
watchObservedRunningTime="2026-03-08 00:21:17.748810202 +0000 UTC m=+931.868442475" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.108925 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_829dcde5-b1d3-4479-875b-6275ec772c1d/manage-dockerfile/0.log" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.109317 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132685 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132732 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132773 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132799 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). 
InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132814 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132875 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132922 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132956 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.132997 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jtf4\" (UniqueName: \"kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc 
kubenswrapper[4713]: I0308 00:21:18.133047 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133072 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133085 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133101 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133167 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs\") pod \"829dcde5-b1d3-4479-875b-6275ec772c1d\" (UID: \"829dcde5-b1d3-4479-875b-6275ec772c1d\") " Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133422 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133554 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133567 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133577 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133723 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133764 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133809 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.133971 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.134057 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.134152 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.138663 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4" (OuterVolumeSpecName: "kube-api-access-4jtf4") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "kube-api-access-4jtf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.140069 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.140993 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "829dcde5-b1d3-4479-875b-6275ec772c1d" (UID: "829dcde5-b1d3-4479-875b-6275ec772c1d"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234590 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234630 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234651 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234664 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234675 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/829dcde5-b1d3-4479-875b-6275ec772c1d-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234685 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jtf4\" (UniqueName: \"kubernetes.io/projected/829dcde5-b1d3-4479-875b-6275ec772c1d-kube-api-access-4jtf4\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234695 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/829dcde5-b1d3-4479-875b-6275ec772c1d-buildcachedir\") on 
node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234705 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/829dcde5-b1d3-4479-875b-6275ec772c1d-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.234714 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/829dcde5-b1d3-4479-875b-6275ec772c1d-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.296336 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 08 00:21:18 crc kubenswrapper[4713]: E0308 00:21:18.296700 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829dcde5-b1d3-4479-875b-6275ec772c1d" containerName="manage-dockerfile" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.296719 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="829dcde5-b1d3-4479-875b-6275ec772c1d" containerName="manage-dockerfile" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.296865 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="829dcde5-b1d3-4479-875b-6275ec772c1d" containerName="manage-dockerfile" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.298007 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.303345 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.303350 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.303681 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.326254 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372283 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372331 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372361 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir\") pod 
\"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372384 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372451 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dclxd\" (UniqueName: \"kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372596 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372635 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372707 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372738 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.372898 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.373000 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles\") pod 
\"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474302 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474357 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474383 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474434 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474474 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dclxd\" (UniqueName: \"kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474508 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474530 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474561 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474578 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474604 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.474629 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.475624 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.476363 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.476475 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.476690 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.477108 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.477205 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.477408 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: 
\"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.477749 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.478712 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.483332 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.483664 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.502563 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dclxd\" (UniqueName: 
\"kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd\") pod \"service-telemetry-operator-2-build\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.613082 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717096 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_829dcde5-b1d3-4479-875b-6275ec772c1d/manage-dockerfile/0.log" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717425 4713 generic.go:334] "Generic (PLEG): container finished" podID="829dcde5-b1d3-4479-875b-6275ec772c1d" containerID="811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e" exitCode=1 Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717521 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"829dcde5-b1d3-4479-875b-6275ec772c1d","Type":"ContainerDied","Data":"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e"} Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717564 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"829dcde5-b1d3-4479-875b-6275ec772c1d","Type":"ContainerDied","Data":"2231a649a8f84f0f717170316c535043ecf640c8208085f92f8a7e585f35d9d1"} Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717587 4713 scope.go:117] "RemoveContainer" containerID="811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.717710 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.754549 4713 scope.go:117] "RemoveContainer" containerID="811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.754648 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 08 00:21:18 crc kubenswrapper[4713]: E0308 00:21:18.755701 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e\": container with ID starting with 811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e not found: ID does not exist" containerID="811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.755741 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e"} err="failed to get container status \"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e\": rpc error: code = NotFound desc = could not find container \"811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e\": container with ID starting with 811613ee504781645359b4f6bef1bb52d0ef4ab49569fbc78f67d20badcc5d0e not found: ID does not exist" Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.756912 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 08 00:21:18 crc kubenswrapper[4713]: I0308 00:21:18.873441 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 08 00:21:19 crc kubenswrapper[4713]: I0308 00:21:19.725733 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerStarted","Data":"31d5c12beb190c33e02771389167e0b587993d663944247e0b12745d5272bbcb"} Mar 08 00:21:19 crc kubenswrapper[4713]: I0308 00:21:19.726058 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerStarted","Data":"66baa3590517de49b5509c8015457716863174fefd4cceac80014e6ff5386a9e"} Mar 08 00:21:20 crc kubenswrapper[4713]: I0308 00:21:20.298519 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Mar 08 00:21:20 crc kubenswrapper[4713]: I0308 00:21:20.549015 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="829dcde5-b1d3-4479-875b-6275ec772c1d" path="/var/lib/kubelet/pods/829dcde5-b1d3-4479-875b-6275ec772c1d/volumes" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.441597 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-gkqzr"] Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.442810 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.447868 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-vqbzd" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.460459 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-gkqzr"] Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.548990 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-bound-sa-token\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: \"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.549078 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d74lj\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-kube-api-access-d74lj\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: \"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.650779 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d74lj\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-kube-api-access-d74lj\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: \"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.650953 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-bound-sa-token\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: 
\"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.671233 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-bound-sa-token\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: \"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.671694 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d74lj\" (UniqueName: \"kubernetes.io/projected/d4f51ae9-d2ab-4704-aeeb-5710aceda4f0-kube-api-access-d74lj\") pod \"cert-manager-545d4d4674-gkqzr\" (UID: \"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0\") " pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:24 crc kubenswrapper[4713]: I0308 00:21:24.758595 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-gkqzr" Mar 08 00:21:25 crc kubenswrapper[4713]: W0308 00:21:25.025108 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4f51ae9_d2ab_4704_aeeb_5710aceda4f0.slice/crio-f17e527d2b15f43635f9fcd8d7d1916de66dae9b3a81c91d8fc931cd375c4ee4 WatchSource:0}: Error finding container f17e527d2b15f43635f9fcd8d7d1916de66dae9b3a81c91d8fc931cd375c4ee4: Status 404 returned error can't find the container with id f17e527d2b15f43635f9fcd8d7d1916de66dae9b3a81c91d8fc931cd375c4ee4 Mar 08 00:21:25 crc kubenswrapper[4713]: I0308 00:21:25.037512 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-gkqzr"] Mar 08 00:21:25 crc kubenswrapper[4713]: I0308 00:21:25.761936 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-gkqzr" 
event={"ID":"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0","Type":"ContainerStarted","Data":"c33fc1e38223c47754a1f717f6ccfef29fa174a7c0c22ce7c73011fe0b21b27a"} Mar 08 00:21:25 crc kubenswrapper[4713]: I0308 00:21:25.761983 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-gkqzr" event={"ID":"d4f51ae9-d2ab-4704-aeeb-5710aceda4f0","Type":"ContainerStarted","Data":"f17e527d2b15f43635f9fcd8d7d1916de66dae9b3a81c91d8fc931cd375c4ee4"} Mar 08 00:21:25 crc kubenswrapper[4713]: I0308 00:21:25.782055 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-gkqzr" podStartSLOduration=1.7820366779999999 podStartE2EDuration="1.782036678s" podCreationTimestamp="2026-03-08 00:21:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:25.778797673 +0000 UTC m=+939.898429916" watchObservedRunningTime="2026-03-08 00:21:25.782036678 +0000 UTC m=+939.901668901" Mar 08 00:21:26 crc kubenswrapper[4713]: I0308 00:21:26.234305 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-qmcpl" Mar 08 00:21:26 crc kubenswrapper[4713]: I0308 00:21:26.776586 4713 generic.go:334] "Generic (PLEG): container finished" podID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerID="31d5c12beb190c33e02771389167e0b587993d663944247e0b12745d5272bbcb" exitCode=0 Mar 08 00:21:26 crc kubenswrapper[4713]: I0308 00:21:26.776672 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerDied","Data":"31d5c12beb190c33e02771389167e0b587993d663944247e0b12745d5272bbcb"} Mar 08 00:21:27 crc kubenswrapper[4713]: I0308 00:21:27.784209 4713 generic.go:334] "Generic (PLEG): container finished" podID="1fc148b4-f954-4ef0-8c15-bbff85220029" 
containerID="3c984fc98d0cb19e91c6d652b313bc38124a2b4f356ef60376d103344ae061d8" exitCode=0 Mar 08 00:21:27 crc kubenswrapper[4713]: I0308 00:21:27.784278 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerDied","Data":"3c984fc98d0cb19e91c6d652b313bc38124a2b4f356ef60376d103344ae061d8"} Mar 08 00:21:27 crc kubenswrapper[4713]: I0308 00:21:27.834591 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_1fc148b4-f954-4ef0-8c15-bbff85220029/manage-dockerfile/0.log" Mar 08 00:21:28 crc kubenswrapper[4713]: I0308 00:21:28.795377 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerStarted","Data":"ee6703f14aab020c6c6eebf428313e07ac09472749abcb07ec3ca3caf3e5ca7f"} Mar 08 00:21:28 crc kubenswrapper[4713]: I0308 00:21:28.822466 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=10.822436721999999 podStartE2EDuration="10.822436722s" podCreationTimestamp="2026-03-08 00:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:21:28.82008209 +0000 UTC m=+942.939714343" watchObservedRunningTime="2026-03-08 00:21:28.822436722 +0000 UTC m=+942.942068955" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.133426 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548822-zwqb8"] Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.135263 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.139823 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548822-zwqb8"] Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.140320 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.140437 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.146434 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.232981 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwdlz\" (UniqueName: \"kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz\") pod \"auto-csr-approver-29548822-zwqb8\" (UID: \"985fdd12-7009-419a-8098-df4c84849d22\") " pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.334536 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwdlz\" (UniqueName: \"kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz\") pod \"auto-csr-approver-29548822-zwqb8\" (UID: \"985fdd12-7009-419a-8098-df4c84849d22\") " pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.361786 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwdlz\" (UniqueName: \"kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz\") pod \"auto-csr-approver-29548822-zwqb8\" (UID: \"985fdd12-7009-419a-8098-df4c84849d22\") " 
pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.456787 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:00 crc kubenswrapper[4713]: I0308 00:22:00.668425 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548822-zwqb8"] Mar 08 00:22:01 crc kubenswrapper[4713]: I0308 00:22:01.045171 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" event={"ID":"985fdd12-7009-419a-8098-df4c84849d22","Type":"ContainerStarted","Data":"354ae8922f628bfcf3a4b66f5eb2f2a9e6348f730d75c4ba294e9425c8c90d10"} Mar 08 00:22:02 crc kubenswrapper[4713]: I0308 00:22:02.052180 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" event={"ID":"985fdd12-7009-419a-8098-df4c84849d22","Type":"ContainerStarted","Data":"03f2240ea47d4e1505d29677bf54b0934fc0985bf6c6ce2acf97701158af0125"} Mar 08 00:22:02 crc kubenswrapper[4713]: I0308 00:22:02.065512 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" podStartSLOduration=1.065513952 podStartE2EDuration="2.065491307s" podCreationTimestamp="2026-03-08 00:22:00 +0000 UTC" firstStartedPulling="2026-03-08 00:22:00.671811121 +0000 UTC m=+974.791443354" lastFinishedPulling="2026-03-08 00:22:01.671788446 +0000 UTC m=+975.791420709" observedRunningTime="2026-03-08 00:22:02.064753998 +0000 UTC m=+976.184386231" watchObservedRunningTime="2026-03-08 00:22:02.065491307 +0000 UTC m=+976.185123540" Mar 08 00:22:03 crc kubenswrapper[4713]: I0308 00:22:03.062270 4713 generic.go:334] "Generic (PLEG): container finished" podID="985fdd12-7009-419a-8098-df4c84849d22" containerID="03f2240ea47d4e1505d29677bf54b0934fc0985bf6c6ce2acf97701158af0125" exitCode=0 Mar 08 00:22:03 crc 
kubenswrapper[4713]: I0308 00:22:03.062318 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" event={"ID":"985fdd12-7009-419a-8098-df4c84849d22","Type":"ContainerDied","Data":"03f2240ea47d4e1505d29677bf54b0934fc0985bf6c6ce2acf97701158af0125"} Mar 08 00:22:04 crc kubenswrapper[4713]: I0308 00:22:04.322328 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:04 crc kubenswrapper[4713]: I0308 00:22:04.400506 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwdlz\" (UniqueName: \"kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz\") pod \"985fdd12-7009-419a-8098-df4c84849d22\" (UID: \"985fdd12-7009-419a-8098-df4c84849d22\") " Mar 08 00:22:04 crc kubenswrapper[4713]: I0308 00:22:04.405793 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz" (OuterVolumeSpecName: "kube-api-access-qwdlz") pod "985fdd12-7009-419a-8098-df4c84849d22" (UID: "985fdd12-7009-419a-8098-df4c84849d22"). InnerVolumeSpecName "kube-api-access-qwdlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:04 crc kubenswrapper[4713]: I0308 00:22:04.502430 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwdlz\" (UniqueName: \"kubernetes.io/projected/985fdd12-7009-419a-8098-df4c84849d22-kube-api-access-qwdlz\") on node \"crc\" DevicePath \"\"" Mar 08 00:22:05 crc kubenswrapper[4713]: I0308 00:22:05.083019 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" event={"ID":"985fdd12-7009-419a-8098-df4c84849d22","Type":"ContainerDied","Data":"354ae8922f628bfcf3a4b66f5eb2f2a9e6348f730d75c4ba294e9425c8c90d10"} Mar 08 00:22:05 crc kubenswrapper[4713]: I0308 00:22:05.083071 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="354ae8922f628bfcf3a4b66f5eb2f2a9e6348f730d75c4ba294e9425c8c90d10" Mar 08 00:22:05 crc kubenswrapper[4713]: I0308 00:22:05.083143 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548822-zwqb8" Mar 08 00:22:05 crc kubenswrapper[4713]: I0308 00:22:05.125525 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548816-gtsk5"] Mar 08 00:22:05 crc kubenswrapper[4713]: I0308 00:22:05.133844 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548816-gtsk5"] Mar 08 00:22:06 crc kubenswrapper[4713]: I0308 00:22:06.550971 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4623866-795f-438d-9b3b-66afb30f9657" path="/var/lib/kubelet/pods/e4623866-795f-438d-9b3b-66afb30f9657/volumes" Mar 08 00:22:14 crc kubenswrapper[4713]: I0308 00:22:14.877416 4713 scope.go:117] "RemoveContainer" containerID="88536119c11c7644e16e9556af63bc5f387d89253eeaf6cbd55a1eddd526755e" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.126662 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:28 crc kubenswrapper[4713]: E0308 00:22:28.127516 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985fdd12-7009-419a-8098-df4c84849d22" containerName="oc" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.127530 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="985fdd12-7009-419a-8098-df4c84849d22" containerName="oc" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.127648 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="985fdd12-7009-419a-8098-df4c84849d22" containerName="oc" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.128473 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.142368 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.250698 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.250944 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt5wn\" (UniqueName: \"kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.251238 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.352246 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.352308 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.352369 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt5wn\" (UniqueName: \"kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.352866 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.352927 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.383027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt5wn\" (UniqueName: \"kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn\") pod \"community-operators-9zwsx\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.488280 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:28 crc kubenswrapper[4713]: I0308 00:22:28.790242 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:29 crc kubenswrapper[4713]: I0308 00:22:29.252268 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerStarted","Data":"4f79f8d113b33a8ed31c712e6bc24fadf44173317f45c47a3aecbb3f986bd86c"} Mar 08 00:22:30 crc kubenswrapper[4713]: I0308 00:22:30.265384 4713 generic.go:334] "Generic (PLEG): container finished" podID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerID="7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957" exitCode=0 Mar 08 00:22:30 crc kubenswrapper[4713]: I0308 00:22:30.265550 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerDied","Data":"7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957"} Mar 08 00:22:32 crc kubenswrapper[4713]: I0308 00:22:32.284294 4713 generic.go:334] "Generic (PLEG): container 
finished" podID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerID="86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce" exitCode=0 Mar 08 00:22:32 crc kubenswrapper[4713]: I0308 00:22:32.284378 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerDied","Data":"86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce"} Mar 08 00:22:33 crc kubenswrapper[4713]: I0308 00:22:33.293883 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerStarted","Data":"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108"} Mar 08 00:22:33 crc kubenswrapper[4713]: I0308 00:22:33.317256 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9zwsx" podStartSLOduration=2.92886965 podStartE2EDuration="5.317235581s" podCreationTimestamp="2026-03-08 00:22:28 +0000 UTC" firstStartedPulling="2026-03-08 00:22:30.269670132 +0000 UTC m=+1004.389302365" lastFinishedPulling="2026-03-08 00:22:32.658036063 +0000 UTC m=+1006.777668296" observedRunningTime="2026-03-08 00:22:33.316794559 +0000 UTC m=+1007.436426802" watchObservedRunningTime="2026-03-08 00:22:33.317235581 +0000 UTC m=+1007.436867814" Mar 08 00:22:34 crc kubenswrapper[4713]: I0308 00:22:34.501590 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:22:34 crc kubenswrapper[4713]: I0308 00:22:34.501656 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" 
podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:22:38 crc kubenswrapper[4713]: I0308 00:22:38.488603 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:38 crc kubenswrapper[4713]: I0308 00:22:38.489236 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:38 crc kubenswrapper[4713]: I0308 00:22:38.553032 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:39 crc kubenswrapper[4713]: I0308 00:22:39.381301 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:39 crc kubenswrapper[4713]: I0308 00:22:39.418247 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:41 crc kubenswrapper[4713]: I0308 00:22:41.337894 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9zwsx" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="registry-server" containerID="cri-o://92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108" gracePeriod=2 Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.153422 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.199201 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content\") pod \"e6a28e60-a4ea-42bc-baaf-d90f095194db\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.199264 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities\") pod \"e6a28e60-a4ea-42bc-baaf-d90f095194db\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.199297 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt5wn\" (UniqueName: \"kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn\") pod \"e6a28e60-a4ea-42bc-baaf-d90f095194db\" (UID: \"e6a28e60-a4ea-42bc-baaf-d90f095194db\") " Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.200410 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities" (OuterVolumeSpecName: "utilities") pod "e6a28e60-a4ea-42bc-baaf-d90f095194db" (UID: "e6a28e60-a4ea-42bc-baaf-d90f095194db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.205082 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn" (OuterVolumeSpecName: "kube-api-access-tt5wn") pod "e6a28e60-a4ea-42bc-baaf-d90f095194db" (UID: "e6a28e60-a4ea-42bc-baaf-d90f095194db"). InnerVolumeSpecName "kube-api-access-tt5wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.254805 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6a28e60-a4ea-42bc-baaf-d90f095194db" (UID: "e6a28e60-a4ea-42bc-baaf-d90f095194db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.300612 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.300656 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt5wn\" (UniqueName: \"kubernetes.io/projected/e6a28e60-a4ea-42bc-baaf-d90f095194db-kube-api-access-tt5wn\") on node \"crc\" DevicePath \"\"" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.300669 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a28e60-a4ea-42bc-baaf-d90f095194db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.364909 4713 generic.go:334] "Generic (PLEG): container finished" podID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerID="92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108" exitCode=0 Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.364948 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerDied","Data":"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108"} Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.364962 4713 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zwsx" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.365059 4713 scope.go:117] "RemoveContainer" containerID="92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.366554 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zwsx" event={"ID":"e6a28e60-a4ea-42bc-baaf-d90f095194db","Type":"ContainerDied","Data":"4f79f8d113b33a8ed31c712e6bc24fadf44173317f45c47a3aecbb3f986bd86c"} Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.379924 4713 scope.go:117] "RemoveContainer" containerID="86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.393919 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.395899 4713 scope.go:117] "RemoveContainer" containerID="7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.403156 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9zwsx"] Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.415382 4713 scope.go:117] "RemoveContainer" containerID="92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108" Mar 08 00:22:45 crc kubenswrapper[4713]: E0308 00:22:45.416031 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108\": container with ID starting with 92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108 not found: ID does not exist" containerID="92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.416083 
4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108"} err="failed to get container status \"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108\": rpc error: code = NotFound desc = could not find container \"92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108\": container with ID starting with 92c72a89a107928ed4e894fdab18396694847871d111ea13884b19906757f108 not found: ID does not exist" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.416114 4713 scope.go:117] "RemoveContainer" containerID="86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce" Mar 08 00:22:45 crc kubenswrapper[4713]: E0308 00:22:45.416406 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce\": container with ID starting with 86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce not found: ID does not exist" containerID="86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.416440 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce"} err="failed to get container status \"86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce\": rpc error: code = NotFound desc = could not find container \"86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce\": container with ID starting with 86a8568ad4d8d26eb934bfa1de549b2580203db400351de12095645eb85258ce not found: ID does not exist" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.416458 4713 scope.go:117] "RemoveContainer" containerID="7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957" Mar 08 00:22:45 crc kubenswrapper[4713]: E0308 
00:22:45.416695 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957\": container with ID starting with 7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957 not found: ID does not exist" containerID="7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957" Mar 08 00:22:45 crc kubenswrapper[4713]: I0308 00:22:45.416720 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957"} err="failed to get container status \"7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957\": rpc error: code = NotFound desc = could not find container \"7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957\": container with ID starting with 7f9113f3ab2d0e883be9558722f019c38fe9e8388dad6e267f07a8b9c81b4957 not found: ID does not exist" Mar 08 00:22:46 crc kubenswrapper[4713]: I0308 00:22:46.549669 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" path="/var/lib/kubelet/pods/e6a28e60-a4ea-42bc-baaf-d90f095194db/volumes" Mar 08 00:23:04 crc kubenswrapper[4713]: I0308 00:23:04.500927 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:23:04 crc kubenswrapper[4713]: I0308 00:23:04.501418 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 08 00:23:09 crc kubenswrapper[4713]: I0308 00:23:09.512755 4713 generic.go:334] "Generic (PLEG): container finished" podID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerID="ee6703f14aab020c6c6eebf428313e07ac09472749abcb07ec3ca3caf3e5ca7f" exitCode=0 Mar 08 00:23:09 crc kubenswrapper[4713]: I0308 00:23:09.512850 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerDied","Data":"ee6703f14aab020c6c6eebf428313e07ac09472749abcb07ec3ca3caf3e5ca7f"} Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.761187 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838548 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838613 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838661 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838693 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838710 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838734 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838754 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dclxd\" (UniqueName: \"kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838785 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838817 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles\") pod 
\"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838865 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838886 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.838910 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull\") pod \"1fc148b4-f954-4ef0-8c15-bbff85220029\" (UID: \"1fc148b4-f954-4ef0-8c15-bbff85220029\") " Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.839682 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.839791 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.839965 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.839950 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.840378 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.840853 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.845995 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.846024 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.846049 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd" (OuterVolumeSpecName: "kube-api-access-dclxd") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "kube-api-access-dclxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.875260 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.940915 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941199 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941329 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941389 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1fc148b4-f954-4ef0-8c15-bbff85220029-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941447 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941502 4713 
reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941555 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941612 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dclxd\" (UniqueName: \"kubernetes.io/projected/1fc148b4-f954-4ef0-8c15-bbff85220029-kube-api-access-dclxd\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941666 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fc148b4-f954-4ef0-8c15-bbff85220029-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:10 crc kubenswrapper[4713]: I0308 00:23:10.941745 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fc148b4-f954-4ef0-8c15-bbff85220029-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:11 crc kubenswrapper[4713]: I0308 00:23:11.019233 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:11 crc kubenswrapper[4713]: I0308 00:23:11.042700 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:11 crc kubenswrapper[4713]: I0308 00:23:11.528041 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"1fc148b4-f954-4ef0-8c15-bbff85220029","Type":"ContainerDied","Data":"66baa3590517de49b5509c8015457716863174fefd4cceac80014e6ff5386a9e"} Mar 08 00:23:11 crc kubenswrapper[4713]: I0308 00:23:11.528082 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66baa3590517de49b5509c8015457716863174fefd4cceac80014e6ff5386a9e" Mar 08 00:23:11 crc kubenswrapper[4713]: I0308 00:23:11.528144 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 08 00:23:12 crc kubenswrapper[4713]: I0308 00:23:12.595815 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "1fc148b4-f954-4ef0-8c15-bbff85220029" (UID: "1fc148b4-f954-4ef0-8c15-bbff85220029"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:12 crc kubenswrapper[4713]: I0308 00:23:12.663805 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fc148b4-f954-4ef0-8c15-bbff85220029-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088114 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088633 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="extract-utilities" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088645 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="extract-utilities" Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088653 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="registry-server" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088659 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="registry-server" Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088672 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="manage-dockerfile" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088678 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="manage-dockerfile" Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088694 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="extract-content" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088699 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="extract-content" Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088707 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="git-clone" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088713 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="git-clone" Mar 08 00:23:15 crc kubenswrapper[4713]: E0308 00:23:15.088720 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="docker-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088727 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="docker-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088855 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a28e60-a4ea-42bc-baaf-d90f095194db" containerName="registry-server" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.088868 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc148b4-f954-4ef0-8c15-bbff85220029" containerName="docker-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.089489 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.091869 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.092094 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.092382 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.092468 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.106477 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.197406 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.197607 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.197775 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.197872 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.197926 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198038 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198107 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc 
kubenswrapper[4713]: I0308 00:23:15.198200 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198305 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knkjm\" (UniqueName: \"kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198353 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198380 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.198411 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300085 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300220 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300258 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300282 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300312 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300345 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300382 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300408 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300427 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300467 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300445 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300674 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knkjm\" (UniqueName: \"kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300752 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.300769 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 
00:23:15.300781 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.301064 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.301186 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.301238 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.301480 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 
crc kubenswrapper[4713]: I0308 00:23:15.301688 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.302293 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.309015 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.309045 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.321307 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knkjm\" (UniqueName: \"kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm\") pod \"smart-gateway-operator-1-build\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:15 crc kubenswrapper[4713]: I0308 00:23:15.405359 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:16 crc kubenswrapper[4713]: I0308 00:23:16.555166 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 08 00:23:16 crc kubenswrapper[4713]: I0308 00:23:16.573915 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"28cea654-fd65-41d0-a3bf-74641ad0990c","Type":"ContainerStarted","Data":"786d94fda50c47e27ddd447590d235cdd4682da4924ec66b3fd625d8e492e3f4"} Mar 08 00:23:17 crc kubenswrapper[4713]: I0308 00:23:17.583367 4713 generic.go:334] "Generic (PLEG): container finished" podID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerID="94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001" exitCode=0 Mar 08 00:23:17 crc kubenswrapper[4713]: I0308 00:23:17.583491 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"28cea654-fd65-41d0-a3bf-74641ad0990c","Type":"ContainerDied","Data":"94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001"} Mar 08 00:23:18 crc kubenswrapper[4713]: I0308 00:23:18.591897 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"28cea654-fd65-41d0-a3bf-74641ad0990c","Type":"ContainerStarted","Data":"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341"} Mar 08 00:23:18 crc kubenswrapper[4713]: I0308 00:23:18.616009 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.615991084 podStartE2EDuration="3.615991084s" podCreationTimestamp="2026-03-08 00:23:15 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:23:18.610812406 +0000 UTC m=+1052.730444659" watchObservedRunningTime="2026-03-08 00:23:18.615991084 +0000 UTC m=+1052.735623317" Mar 08 00:23:25 crc kubenswrapper[4713]: I0308 00:23:25.514391 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 08 00:23:25 crc kubenswrapper[4713]: I0308 00:23:25.515496 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="docker-build" containerID="cri-o://22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341" gracePeriod=30 Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.183469 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.185674 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.187452 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.188344 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.189952 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.213752 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274642 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274726 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274749 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fftds\" (UniqueName: \"kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds\") pod 
\"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274768 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274932 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.274973 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275028 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275147 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275220 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275277 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275356 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.275408 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376458 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376499 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fftds\" (UniqueName: \"kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376522 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376543 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376560 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: 
\"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376583 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376603 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376630 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376653 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376672 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376694 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.376718 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.377172 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.377613 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.377847 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.377975 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.378056 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.378084 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.378101 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.378188 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.378723 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.390746 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.391192 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.398492 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fftds\" (UniqueName: \"kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds\") pod \"smart-gateway-operator-2-build\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.502189 4713 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:23:27 crc kubenswrapper[4713]: I0308 00:23:27.903915 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 08 00:23:27 crc kubenswrapper[4713]: W0308 00:23:27.914255 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod036ba45b_c97e_4ac4_a537_373dfa81f0de.slice/crio-99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7 WatchSource:0}: Error finding container 99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7: Status 404 returned error can't find the container with id 99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7 Mar 08 00:23:28 crc kubenswrapper[4713]: I0308 00:23:28.659710 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerStarted","Data":"99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7"} Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.618669 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_28cea654-fd65-41d0-a3bf-74641ad0990c/docker-build/0.log" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.619268 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.667443 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerStarted","Data":"70059d4912f6673006f3786721d68ea839c745cba2342d836a8a02bc5cd3016a"} Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.668752 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_28cea654-fd65-41d0-a3bf-74641ad0990c/docker-build/0.log" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.669134 4713 generic.go:334] "Generic (PLEG): container finished" podID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerID="22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341" exitCode=1 Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.669163 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"28cea654-fd65-41d0-a3bf-74641ad0990c","Type":"ContainerDied","Data":"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341"} Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.669187 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"28cea654-fd65-41d0-a3bf-74641ad0990c","Type":"ContainerDied","Data":"786d94fda50c47e27ddd447590d235cdd4682da4924ec66b3fd625d8e492e3f4"} Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.669208 4713 scope.go:117] "RemoveContainer" containerID="22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.669166 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707603 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707754 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707782 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707879 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707914 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knkjm\" (UniqueName: \"kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.707969 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708008 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708038 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708071 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708091 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708097 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708731 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708764 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708994 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.708157 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.709148 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.709184 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull\") pod \"28cea654-fd65-41d0-a3bf-74641ad0990c\" (UID: \"28cea654-fd65-41d0-a3bf-74641ad0990c\") " Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.709073 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710006 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710033 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710045 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710054 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28cea654-fd65-41d0-a3bf-74641ad0990c-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710066 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.710076 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cea654-fd65-41d0-a3bf-74641ad0990c-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.711369 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod 
"28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.713401 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.713461 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.714096 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm" (OuterVolumeSpecName: "kube-api-access-knkjm") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "kube-api-access-knkjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.751535 4713 scope.go:117] "RemoveContainer" containerID="94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.783081 4713 scope.go:117] "RemoveContainer" containerID="22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341" Mar 08 00:23:29 crc kubenswrapper[4713]: E0308 00:23:29.783618 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341\": container with ID starting with 22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341 not found: ID does not exist" containerID="22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.783654 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341"} err="failed to get container status \"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341\": rpc error: code = NotFound desc = could not find container \"22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341\": container with ID starting with 22472e2e88e22e821408b69b900850c0f4cfe857988a1ba718b2fad073f95341 not found: ID does not exist" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.783675 4713 scope.go:117] "RemoveContainer" containerID="94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001" Mar 08 00:23:29 crc kubenswrapper[4713]: E0308 00:23:29.784843 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001\": container with ID starting with 
94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001 not found: ID does not exist" containerID="94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.784920 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001"} err="failed to get container status \"94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001\": rpc error: code = NotFound desc = could not find container \"94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001\": container with ID starting with 94cfdb707f1c9f93a207c75667296b22830471b2b16f9d2a90c008cbb7c58001 not found: ID does not exist" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.811425 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.811461 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/28cea654-fd65-41d0-a3bf-74641ad0990c-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.811472 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.811485 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knkjm\" (UniqueName: \"kubernetes.io/projected/28cea654-fd65-41d0-a3bf-74641ad0990c-kube-api-access-knkjm\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.957542 4713 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:29 crc kubenswrapper[4713]: I0308 00:23:29.994125 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "28cea654-fd65-41d0-a3bf-74641ad0990c" (UID: "28cea654-fd65-41d0-a3bf-74641ad0990c"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.014623 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.014666 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/28cea654-fd65-41d0-a3bf-74641ad0990c-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.304802 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.310829 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.549587 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" path="/var/lib/kubelet/pods/28cea654-fd65-41d0-a3bf-74641ad0990c/volumes" 
Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.678256 4713 generic.go:334] "Generic (PLEG): container finished" podID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerID="70059d4912f6673006f3786721d68ea839c745cba2342d836a8a02bc5cd3016a" exitCode=0 Mar 08 00:23:30 crc kubenswrapper[4713]: I0308 00:23:30.678317 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerDied","Data":"70059d4912f6673006f3786721d68ea839c745cba2342d836a8a02bc5cd3016a"} Mar 08 00:23:31 crc kubenswrapper[4713]: I0308 00:23:31.687408 4713 generic.go:334] "Generic (PLEG): container finished" podID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerID="df57be6898528c792d3da245f48f36ebd1e922776e89da3fc00040bd8ac76e19" exitCode=0 Mar 08 00:23:31 crc kubenswrapper[4713]: I0308 00:23:31.687453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerDied","Data":"df57be6898528c792d3da245f48f36ebd1e922776e89da3fc00040bd8ac76e19"} Mar 08 00:23:31 crc kubenswrapper[4713]: I0308 00:23:31.730657 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_036ba45b-c97e-4ac4-a537-373dfa81f0de/manage-dockerfile/0.log" Mar 08 00:23:32 crc kubenswrapper[4713]: I0308 00:23:32.698064 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerStarted","Data":"e91504847d35f8027e57aadd536ccdea7215ff603a6a8dc70b2cb358b3c880ab"} Mar 08 00:23:32 crc kubenswrapper[4713]: I0308 00:23:32.741945 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.741921315 podStartE2EDuration="5.741921315s" 
podCreationTimestamp="2026-03-08 00:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:23:32.735252257 +0000 UTC m=+1066.854884500" watchObservedRunningTime="2026-03-08 00:23:32.741921315 +0000 UTC m=+1066.861553548" Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.501171 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.501775 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.501879 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.502944 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.503027 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" 
containerID="cri-o://c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0" gracePeriod=600 Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.713918 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0" exitCode=0 Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.713956 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0"} Mar 08 00:23:34 crc kubenswrapper[4713]: I0308 00:23:34.713986 4713 scope.go:117] "RemoveContainer" containerID="3f58d2453dfb0789e4b6de1707b22e49490c850b97fdf881933aaed3e3ea5cb4" Mar 08 00:23:35 crc kubenswrapper[4713]: I0308 00:23:35.722124 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f"} Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.132361 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548824-mrbjn"] Mar 08 00:24:00 crc kubenswrapper[4713]: E0308 00:24:00.133098 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="manage-dockerfile" Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.133110 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="manage-dockerfile" Mar 08 00:24:00 crc kubenswrapper[4713]: E0308 00:24:00.133120 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="docker-build" Mar 08 
00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.133126 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="docker-build" Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.133235 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cea654-fd65-41d0-a3bf-74641ad0990c" containerName="docker-build" Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.133606 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548824-mrbjn" Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.136612 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.136683 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.136798 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.140404 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548824-mrbjn"] Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.213654 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94wj4\" (UniqueName: \"kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4\") pod \"auto-csr-approver-29548824-mrbjn\" (UID: \"42829204-3911-4926-bcab-0e8f7b731986\") " pod="openshift-infra/auto-csr-approver-29548824-mrbjn" Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.315041 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94wj4\" (UniqueName: \"kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4\") pod 
\"auto-csr-approver-29548824-mrbjn\" (UID: \"42829204-3911-4926-bcab-0e8f7b731986\") " pod="openshift-infra/auto-csr-approver-29548824-mrbjn" Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.333739 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94wj4\" (UniqueName: \"kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4\") pod \"auto-csr-approver-29548824-mrbjn\" (UID: \"42829204-3911-4926-bcab-0e8f7b731986\") " pod="openshift-infra/auto-csr-approver-29548824-mrbjn" Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.461725 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548824-mrbjn" Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.861175 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548824-mrbjn"] Mar 08 00:24:00 crc kubenswrapper[4713]: I0308 00:24:00.883444 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548824-mrbjn" event={"ID":"42829204-3911-4926-bcab-0e8f7b731986","Type":"ContainerStarted","Data":"3d892d14bb1c80986170cb8cd73af5739315d3b18f77257ddc15c638af4a621e"} Mar 08 00:24:03 crc kubenswrapper[4713]: I0308 00:24:03.903492 4713 generic.go:334] "Generic (PLEG): container finished" podID="42829204-3911-4926-bcab-0e8f7b731986" containerID="5194adfd055d923428c5bad5d8993dba160fbbc540dca7c2cc8ef69daad1dbf4" exitCode=0 Mar 08 00:24:03 crc kubenswrapper[4713]: I0308 00:24:03.903568 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548824-mrbjn" event={"ID":"42829204-3911-4926-bcab-0e8f7b731986","Type":"ContainerDied","Data":"5194adfd055d923428c5bad5d8993dba160fbbc540dca7c2cc8ef69daad1dbf4"} Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.140355 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548824-mrbjn" Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.277388 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94wj4\" (UniqueName: \"kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4\") pod \"42829204-3911-4926-bcab-0e8f7b731986\" (UID: \"42829204-3911-4926-bcab-0e8f7b731986\") " Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.284065 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4" (OuterVolumeSpecName: "kube-api-access-94wj4") pod "42829204-3911-4926-bcab-0e8f7b731986" (UID: "42829204-3911-4926-bcab-0e8f7b731986"). InnerVolumeSpecName "kube-api-access-94wj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.378790 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94wj4\" (UniqueName: \"kubernetes.io/projected/42829204-3911-4926-bcab-0e8f7b731986-kube-api-access-94wj4\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.922151 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548824-mrbjn" event={"ID":"42829204-3911-4926-bcab-0e8f7b731986","Type":"ContainerDied","Data":"3d892d14bb1c80986170cb8cd73af5739315d3b18f77257ddc15c638af4a621e"} Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.922477 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d892d14bb1c80986170cb8cd73af5739315d3b18f77257ddc15c638af4a621e" Mar 08 00:24:05 crc kubenswrapper[4713]: I0308 00:24:05.922168 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548824-mrbjn" Mar 08 00:24:06 crc kubenswrapper[4713]: I0308 00:24:06.213545 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548818-c92cn"] Mar 08 00:24:06 crc kubenswrapper[4713]: I0308 00:24:06.218463 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548818-c92cn"] Mar 08 00:24:06 crc kubenswrapper[4713]: I0308 00:24:06.548534 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf256d4-02b4-46fd-86a1-793e34a17bf5" path="/var/lib/kubelet/pods/bbf256d4-02b4-46fd-86a1-793e34a17bf5/volumes" Mar 08 00:24:14 crc kubenswrapper[4713]: I0308 00:24:14.964464 4713 scope.go:117] "RemoveContainer" containerID="0f83288064679e56b151b6696b75672f2d4637476a38071e252b04509b88078f" Mar 08 00:24:51 crc kubenswrapper[4713]: I0308 00:24:51.223848 4713 generic.go:334] "Generic (PLEG): container finished" podID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerID="e91504847d35f8027e57aadd536ccdea7215ff603a6a8dc70b2cb358b3c880ab" exitCode=0 Mar 08 00:24:51 crc kubenswrapper[4713]: I0308 00:24:51.223898 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerDied","Data":"e91504847d35f8027e57aadd536ccdea7215ff603a6a8dc70b2cb358b3c880ab"} Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.474815 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617628 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617680 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617709 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617729 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617756 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fftds\" (UniqueName: \"kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617778 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617795 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617931 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617956 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.617996 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.618015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root\") pod 
\"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.618040 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push\") pod \"036ba45b-c97e-4ac4-a537-373dfa81f0de\" (UID: \"036ba45b-c97e-4ac4-a537-373dfa81f0de\") " Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.618288 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.618578 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.618971 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.619478 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.621974 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.622360 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.622713 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.623187 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.623808 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.624606 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.626260 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds" (OuterVolumeSpecName: "kube-api-access-fftds") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "kube-api-access-fftds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719472 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719497 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719507 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fftds\" (UniqueName: \"kubernetes.io/projected/036ba45b-c97e-4ac4-a537-373dfa81f0de-kube-api-access-fftds\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719515 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719525 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719534 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719542 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-push\") on node \"crc\" 
DevicePath \"\"" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719550 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/036ba45b-c97e-4ac4-a537-373dfa81f0de-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.719559 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/036ba45b-c97e-4ac4-a537-373dfa81f0de-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.814964 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:24:52 crc kubenswrapper[4713]: I0308 00:24:52.820437 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:53 crc kubenswrapper[4713]: I0308 00:24:53.237359 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"036ba45b-c97e-4ac4-a537-373dfa81f0de","Type":"ContainerDied","Data":"99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7"} Mar 08 00:24:53 crc kubenswrapper[4713]: I0308 00:24:53.237396 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99e67191fc98c3ca6b2a46bb30dbdf3d717dba25a505f668389d0a3c46fc65b7" Mar 08 00:24:53 crc kubenswrapper[4713]: I0308 00:24:53.237419 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 08 00:24:54 crc kubenswrapper[4713]: I0308 00:24:54.375461 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "036ba45b-c97e-4ac4-a537-373dfa81f0de" (UID: "036ba45b-c97e-4ac4-a537-373dfa81f0de"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:24:54 crc kubenswrapper[4713]: I0308 00:24:54.444951 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/036ba45b-c97e-4ac4-a537-373dfa81f0de-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.526130 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 08 00:24:56 crc kubenswrapper[4713]: E0308 00:24:56.527970 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="docker-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.528067 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="docker-build" Mar 08 00:24:56 crc kubenswrapper[4713]: E0308 00:24:56.528145 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="manage-dockerfile" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.528239 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="manage-dockerfile" Mar 08 00:24:56 crc kubenswrapper[4713]: E0308 00:24:56.528312 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="git-clone" Mar 08 00:24:56 crc 
kubenswrapper[4713]: I0308 00:24:56.528398 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="git-clone" Mar 08 00:24:56 crc kubenswrapper[4713]: E0308 00:24:56.528471 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42829204-3911-4926-bcab-0e8f7b731986" containerName="oc" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.528544 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="42829204-3911-4926-bcab-0e8f7b731986" containerName="oc" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.528738 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="42829204-3911-4926-bcab-0e8f7b731986" containerName="oc" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.528854 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="036ba45b-c97e-4ac4-a537-373dfa81f0de" containerName="docker-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.529673 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.531747 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.532085 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.532329 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.532638 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.549747 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675127 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675175 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675196 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675218 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675238 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pndlx\" (UniqueName: \"kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675260 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675288 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675306 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675328 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675351 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675370 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.675386 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.777120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" 
(UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778173 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778233 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778258 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778303 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778346 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778379 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778405 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778454 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pndlx\" (UniqueName: \"kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778504 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778569 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir\") pod \"sg-core-1-build\" 
(UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778608 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778751 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778848 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778859 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778450 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 
00:24:56.778917 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778925 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.778971 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.779084 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.779681 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.784457 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: 
\"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.794100 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.798197 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pndlx\" (UniqueName: \"kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx\") pod \"sg-core-1-build\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " pod="service-telemetry/sg-core-1-build" Mar 08 00:24:56 crc kubenswrapper[4713]: I0308 00:24:56.851013 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 08 00:24:57 crc kubenswrapper[4713]: I0308 00:24:57.066399 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 08 00:24:57 crc kubenswrapper[4713]: I0308 00:24:57.262257 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"75b6be2f-9bac-4c3b-94b5-7a063d891561","Type":"ContainerStarted","Data":"af27dda306199e60ac82c42d73b874c62495ade449cd1c0b0121c417c647d7e1"} Mar 08 00:24:59 crc kubenswrapper[4713]: I0308 00:24:59.277970 4713 generic.go:334] "Generic (PLEG): container finished" podID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerID="820cbae741a6e2d5638bbd708e83fc7ff3413d84da68f379ee5daacc65d0210d" exitCode=0 Mar 08 00:24:59 crc kubenswrapper[4713]: I0308 00:24:59.278067 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"75b6be2f-9bac-4c3b-94b5-7a063d891561","Type":"ContainerDied","Data":"820cbae741a6e2d5638bbd708e83fc7ff3413d84da68f379ee5daacc65d0210d"} Mar 08 00:25:00 crc kubenswrapper[4713]: I0308 00:25:00.287590 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"75b6be2f-9bac-4c3b-94b5-7a063d891561","Type":"ContainerStarted","Data":"73f917ff900abac7775688508758d5acc8574a07df218ff492d7d488ac8aea76"} Mar 08 00:25:00 crc kubenswrapper[4713]: I0308 00:25:00.313106 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=4.313088232 podStartE2EDuration="4.313088232s" podCreationTimestamp="2026-03-08 00:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:25:00.31188527 +0000 UTC m=+1154.431517503" watchObservedRunningTime="2026-03-08 00:25:00.313088232 +0000 UTC m=+1154.432720465" Mar 08 00:25:07 crc 
kubenswrapper[4713]: I0308 00:25:07.064856 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.066623 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="docker-build" containerID="cri-o://73f917ff900abac7775688508758d5acc8574a07df218ff492d7d488ac8aea76" gracePeriod=30 Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.328461 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_75b6be2f-9bac-4c3b-94b5-7a063d891561/docker-build/0.log" Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.328870 4713 generic.go:334] "Generic (PLEG): container finished" podID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerID="73f917ff900abac7775688508758d5acc8574a07df218ff492d7d488ac8aea76" exitCode=1 Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.328916 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"75b6be2f-9bac-4c3b-94b5-7a063d891561","Type":"ContainerDied","Data":"73f917ff900abac7775688508758d5acc8574a07df218ff492d7d488ac8aea76"} Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.956492 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_75b6be2f-9bac-4c3b-94b5-7a063d891561/docker-build/0.log" Mar 08 00:25:07 crc kubenswrapper[4713]: I0308 00:25:07.958280 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125366 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pndlx\" (UniqueName: \"kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125452 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125514 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125543 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125563 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125627 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125655 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125687 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125722 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125747 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125775 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets\") pod 
\"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.125848 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push\") pod \"75b6be2f-9bac-4c3b-94b5-7a063d891561\" (UID: \"75b6be2f-9bac-4c3b-94b5-7a063d891561\") " Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.126181 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.126387 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.127008 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.127083 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.128064 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.128073 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.128485 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.132025 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.132291 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.132864 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx" (OuterVolumeSpecName: "kube-api-access-pndlx") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "kube-api-access-pndlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.211083 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227668 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227704 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227715 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227725 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227735 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227744 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227752 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/75b6be2f-9bac-4c3b-94b5-7a063d891561-build-system-configs\") on node \"crc\" DevicePath 
\"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227763 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227774 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/75b6be2f-9bac-4c3b-94b5-7a063d891561-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227785 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pndlx\" (UniqueName: \"kubernetes.io/projected/75b6be2f-9bac-4c3b-94b5-7a063d891561-kube-api-access-pndlx\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.227795 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/75b6be2f-9bac-4c3b-94b5-7a063d891561-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.245024 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "75b6be2f-9bac-4c3b-94b5-7a063d891561" (UID: "75b6be2f-9bac-4c3b-94b5-7a063d891561"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.328524 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/75b6be2f-9bac-4c3b-94b5-7a063d891561-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.336362 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_75b6be2f-9bac-4c3b-94b5-7a063d891561/docker-build/0.log" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.336890 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"75b6be2f-9bac-4c3b-94b5-7a063d891561","Type":"ContainerDied","Data":"af27dda306199e60ac82c42d73b874c62495ade449cd1c0b0121c417c647d7e1"} Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.336929 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.336938 4713 scope.go:117] "RemoveContainer" containerID="73f917ff900abac7775688508758d5acc8574a07df218ff492d7d488ac8aea76" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.372844 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.378003 4713 scope.go:117] "RemoveContainer" containerID="820cbae741a6e2d5638bbd708e83fc7ff3413d84da68f379ee5daacc65d0210d" Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.381589 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.548160 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" path="/var/lib/kubelet/pods/75b6be2f-9bac-4c3b-94b5-7a063d891561/volumes" Mar 08 00:25:08 crc 
kubenswrapper[4713]: I0308 00:25:08.697866 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"]
Mar 08 00:25:08 crc kubenswrapper[4713]: E0308 00:25:08.698161 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="docker-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.698174 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="docker-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: E0308 00:25:08.698187 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="manage-dockerfile"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.698194 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="manage-dockerfile"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.698312 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="75b6be2f-9bac-4c3b-94b5-7a063d891561" containerName="docker-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.699112 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.702508 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.702511 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.704371 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.704493 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.713068 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"]
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835563 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835616 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835635 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835683 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835706 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835721 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835737 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwq72\" (UniqueName: \"kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835763 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835777 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835842 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835874 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.835891 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.936781 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.936946 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.936983 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937028 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937067 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937073 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937101 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937135 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwq72\" (UniqueName: \"kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937195 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937224 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937263 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937272 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937300 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937296 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937335 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937416 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937567 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.937897 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.938331 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.938382 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.939322 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.942423 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.942462 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:08 crc kubenswrapper[4713]: I0308 00:25:08.954556 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwq72\" (UniqueName: \"kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72\") pod \"sg-core-2-build\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:09 crc kubenswrapper[4713]: I0308 00:25:09.029673 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build"
Mar 08 00:25:09 crc kubenswrapper[4713]: I0308 00:25:09.210653 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"]
Mar 08 00:25:09 crc kubenswrapper[4713]: I0308 00:25:09.344596 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerStarted","Data":"545332d882267752a7fc0c2268f7c2474d414930cabb6e9b981fa928a8e47be4"}
Mar 08 00:25:10 crc kubenswrapper[4713]: I0308 00:25:10.351990 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerStarted","Data":"0d8e9e92cf71cf9a49208f26bbb668a5ab48cd9e4a37c2c1942070005899b895"}
Mar 08 00:25:11 crc kubenswrapper[4713]: I0308 00:25:11.363967 4713 generic.go:334] "Generic (PLEG): container finished" podID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerID="0d8e9e92cf71cf9a49208f26bbb668a5ab48cd9e4a37c2c1942070005899b895" exitCode=0
Mar 08 00:25:11 crc kubenswrapper[4713]: I0308 00:25:11.364011 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerDied","Data":"0d8e9e92cf71cf9a49208f26bbb668a5ab48cd9e4a37c2c1942070005899b895"}
Mar 08 00:25:12 crc kubenswrapper[4713]: I0308 00:25:12.373956 4713 generic.go:334] "Generic (PLEG): container finished" podID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerID="10805c8581330d572333818f1f8b595a89a5246c39de1a0d940c7497db5c499f" exitCode=0
Mar 08 00:25:12 crc kubenswrapper[4713]: I0308 00:25:12.374046 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerDied","Data":"10805c8581330d572333818f1f8b595a89a5246c39de1a0d940c7497db5c499f"}
Mar 08 00:25:12 crc kubenswrapper[4713]: I0308 00:25:12.407706 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_b950bb15-0796-4aa8-9920-6c0d3dd622e7/manage-dockerfile/0.log"
Mar 08 00:25:13 crc kubenswrapper[4713]: I0308 00:25:13.381996 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerStarted","Data":"9c2d53d2e25840ed0d4868a439ef4aa09e614db37d156b21c523365ff053b5e7"}
Mar 08 00:25:13 crc kubenswrapper[4713]: I0308 00:25:13.413546 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.413519053 podStartE2EDuration="5.413519053s" podCreationTimestamp="2026-03-08 00:25:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:25:13.405595695 +0000 UTC m=+1167.525227948" watchObservedRunningTime="2026-03-08 00:25:13.413519053 +0000 UTC m=+1167.533151296"
Mar 08 00:25:34 crc kubenswrapper[4713]: I0308 00:25:34.501293 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:25:34 crc kubenswrapper[4713]: I0308 00:25:34.501911 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.137897 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548826-fhk5r"]
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.139607 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.142235 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.142754 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.142916 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.146344 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548826-fhk5r"]
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.286211 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf7cq\" (UniqueName: \"kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq\") pod \"auto-csr-approver-29548826-fhk5r\" (UID: \"45fc1987-0bdc-476c-9315-18ddbf570461\") " pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.387138 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf7cq\" (UniqueName: \"kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq\") pod \"auto-csr-approver-29548826-fhk5r\" (UID: \"45fc1987-0bdc-476c-9315-18ddbf570461\") " pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.406770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf7cq\" (UniqueName: \"kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq\") pod \"auto-csr-approver-29548826-fhk5r\" (UID: \"45fc1987-0bdc-476c-9315-18ddbf570461\") " pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.458055 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:00 crc kubenswrapper[4713]: I0308 00:26:00.872431 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548826-fhk5r"]
Mar 08 00:26:01 crc kubenswrapper[4713]: I0308 00:26:01.667853 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548826-fhk5r" event={"ID":"45fc1987-0bdc-476c-9315-18ddbf570461","Type":"ContainerStarted","Data":"badce5250d1b5ad4223d4d020e98203d0342b8010e59163b7be0bc706789e8d6"}
Mar 08 00:26:03 crc kubenswrapper[4713]: I0308 00:26:03.683924 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548826-fhk5r" event={"ID":"45fc1987-0bdc-476c-9315-18ddbf570461","Type":"ContainerStarted","Data":"76cb1ca43446adb6dc230f530d8737aea0a1011651185fc5861e17e4b5ae2a6c"}
Mar 08 00:26:03 crc kubenswrapper[4713]: I0308 00:26:03.701123 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548826-fhk5r" podStartSLOduration=1.178882876 podStartE2EDuration="3.70110945s" podCreationTimestamp="2026-03-08 00:26:00 +0000 UTC" firstStartedPulling="2026-03-08 00:26:00.884555115 +0000 UTC m=+1215.004187348" lastFinishedPulling="2026-03-08 00:26:03.406781689 +0000 UTC m=+1217.526413922" observedRunningTime="2026-03-08 00:26:03.699452936 +0000 UTC m=+1217.819085189" watchObservedRunningTime="2026-03-08 00:26:03.70110945 +0000 UTC m=+1217.820741683"
Mar 08 00:26:04 crc kubenswrapper[4713]: I0308 00:26:04.501110 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:26:04 crc kubenswrapper[4713]: I0308 00:26:04.501169 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:26:04 crc kubenswrapper[4713]: I0308 00:26:04.691370 4713 generic.go:334] "Generic (PLEG): container finished" podID="45fc1987-0bdc-476c-9315-18ddbf570461" containerID="76cb1ca43446adb6dc230f530d8737aea0a1011651185fc5861e17e4b5ae2a6c" exitCode=0
Mar 08 00:26:04 crc kubenswrapper[4713]: I0308 00:26:04.691434 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548826-fhk5r" event={"ID":"45fc1987-0bdc-476c-9315-18ddbf570461","Type":"ContainerDied","Data":"76cb1ca43446adb6dc230f530d8737aea0a1011651185fc5861e17e4b5ae2a6c"}
Mar 08 00:26:05 crc kubenswrapper[4713]: I0308 00:26:05.983219 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.068059 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf7cq\" (UniqueName: \"kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq\") pod \"45fc1987-0bdc-476c-9315-18ddbf570461\" (UID: \"45fc1987-0bdc-476c-9315-18ddbf570461\") "
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.076031 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq" (OuterVolumeSpecName: "kube-api-access-zf7cq") pod "45fc1987-0bdc-476c-9315-18ddbf570461" (UID: "45fc1987-0bdc-476c-9315-18ddbf570461"). InnerVolumeSpecName "kube-api-access-zf7cq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.169972 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf7cq\" (UniqueName: \"kubernetes.io/projected/45fc1987-0bdc-476c-9315-18ddbf570461-kube-api-access-zf7cq\") on node \"crc\" DevicePath \"\""
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.704783 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548826-fhk5r" event={"ID":"45fc1987-0bdc-476c-9315-18ddbf570461","Type":"ContainerDied","Data":"badce5250d1b5ad4223d4d020e98203d0342b8010e59163b7be0bc706789e8d6"}
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.704845 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="badce5250d1b5ad4223d4d020e98203d0342b8010e59163b7be0bc706789e8d6"
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.704854 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548826-fhk5r"
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.750763 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548820-cts7b"]
Mar 08 00:26:06 crc kubenswrapper[4713]: I0308 00:26:06.757760 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548820-cts7b"]
Mar 08 00:26:08 crc kubenswrapper[4713]: I0308 00:26:08.549562 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c62a3d3-0f8a-40d6-a2f0-b860e9c85085" path="/var/lib/kubelet/pods/8c62a3d3-0f8a-40d6-a2f0-b860e9c85085/volumes"
Mar 08 00:26:15 crc kubenswrapper[4713]: I0308 00:26:15.058972 4713 scope.go:117] "RemoveContainer" containerID="f841e6785162901f02d099ef1f13977229ba672ec5a1c4b87a1f7c3c310267fe"
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.501063 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.501633 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.501683 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v"
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.502299 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.502367 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f" gracePeriod=600
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.886748 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f" exitCode=0
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.886852 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f"}
Mar 08 00:26:34 crc kubenswrapper[4713]: I0308 00:26:34.887591 4713 scope.go:117] "RemoveContainer" containerID="c05ee6e5a19168a6d6242d209054a09db1bc72634110e6c102d8134908c2acc0"
Mar 08 00:26:35 crc kubenswrapper[4713]: I0308 00:26:35.897010 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c"}
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.137704 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548828-b8fft"]
Mar 08 00:28:00 crc kubenswrapper[4713]: E0308 00:28:00.138657 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fc1987-0bdc-476c-9315-18ddbf570461" containerName="oc"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.138679 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fc1987-0bdc-476c-9315-18ddbf570461" containerName="oc"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.138862 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fc1987-0bdc-476c-9315-18ddbf570461" containerName="oc"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.139365 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.143964 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.144065 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.144243 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.146395 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548828-b8fft"]
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.312910 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlxdg\" (UniqueName: \"kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg\") pod \"auto-csr-approver-29548828-b8fft\" (UID: \"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3\") " pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.414160 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlxdg\" (UniqueName: \"kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg\") pod \"auto-csr-approver-29548828-b8fft\" (UID: \"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3\") " pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.448607 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlxdg\" (UniqueName: \"kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg\") pod \"auto-csr-approver-29548828-b8fft\" (UID: \"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3\") " pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.467453 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.886485 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548828-b8fft"]
Mar 08 00:28:00 crc kubenswrapper[4713]: W0308 00:28:00.891793 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f9ab32_0c71_4b60_b499_75b2f4f4dcf3.slice/crio-39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21 WatchSource:0}: Error finding container 39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21: Status 404 returned error can't find the container with id 39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21
Mar 08 00:28:00 crc kubenswrapper[4713]: I0308 00:28:00.894436 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 00:28:01 crc kubenswrapper[4713]: I0308 00:28:01.611164 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548828-b8fft" event={"ID":"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3","Type":"ContainerStarted","Data":"39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21"}
Mar 08 00:28:06 crc kubenswrapper[4713]: I0308 00:28:06.642611 4713 generic.go:334] "Generic (PLEG): container finished" podID="91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" containerID="ef6200b05d87f80e3b68b8cd3aa4e78082a7e3103ea753de97cc7213a72cdd71" exitCode=0
Mar 08 00:28:06 crc kubenswrapper[4713]: I0308 00:28:06.642655 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548828-b8fft" event={"ID":"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3","Type":"ContainerDied","Data":"ef6200b05d87f80e3b68b8cd3aa4e78082a7e3103ea753de97cc7213a72cdd71"}
Mar 08 00:28:07 crc kubenswrapper[4713]: I0308 00:28:07.844886 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.013305 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlxdg\" (UniqueName: \"kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg\") pod \"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3\" (UID: \"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3\") "
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.019362 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg" (OuterVolumeSpecName: "kube-api-access-rlxdg") pod "91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" (UID: "91f9ab32-0c71-4b60-b499-75b2f4f4dcf3"). InnerVolumeSpecName "kube-api-access-rlxdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.115282 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlxdg\" (UniqueName: \"kubernetes.io/projected/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3-kube-api-access-rlxdg\") on node \"crc\" DevicePath \"\""
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.656054 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548828-b8fft" event={"ID":"91f9ab32-0c71-4b60-b499-75b2f4f4dcf3","Type":"ContainerDied","Data":"39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21"}
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.656094 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39dae9c861d36dec5f3f5bee86abcca2160152f47b8648d1c06815228f4b0d21"
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.656132 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548828-b8fft"
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.897418 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548822-zwqb8"]
Mar 08 00:28:08 crc kubenswrapper[4713]: I0308 00:28:08.901758 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548822-zwqb8"]
Mar 08 00:28:10 crc kubenswrapper[4713]: I0308 00:28:10.555196 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="985fdd12-7009-419a-8098-df4c84849d22" path="/var/lib/kubelet/pods/985fdd12-7009-419a-8098-df4c84849d22/volumes"
Mar 08 00:28:15 crc kubenswrapper[4713]: I0308 00:28:15.128955 4713 scope.go:117] "RemoveContainer" containerID="03f2240ea47d4e1505d29677bf54b0934fc0985bf6c6ce2acf97701158af0125"
Mar 08 00:28:29 crc kubenswrapper[4713]: I0308 00:28:29.781350 4713 generic.go:334] "Generic (PLEG): container finished" podID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerID="9c2d53d2e25840ed0d4868a439ef4aa09e614db37d156b21c523365ff053b5e7" exitCode=0
Mar 08 00:28:29 crc kubenswrapper[4713]: I0308 00:28:29.781431 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerDied","Data":"9c2d53d2e25840ed0d4868a439ef4aa09e614db37d156b21c523365ff053b5e7"}
Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.022648 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build"
Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100664 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") "
Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100718 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwq72\" (UniqueName: \"kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") "
Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100754 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") "
Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100797 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName:
\"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100859 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100893 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100922 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.100968 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101000 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 
00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101011 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101031 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101060 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101100 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root\") pod \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\" (UID: \"b950bb15-0796-4aa8-9920-6c0d3dd622e7\") " Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101377 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101489 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs" 
(OuterVolumeSpecName: "build-system-configs") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101556 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.101975 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.102205 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.102541 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). 
InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.105963 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.106270 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.106590 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72" (OuterVolumeSpecName: "kube-api-access-wwq72") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "kube-api-access-wwq72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.113198 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.201946 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.201984 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.201996 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202007 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202018 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b950bb15-0796-4aa8-9920-6c0d3dd622e7-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202029 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202040 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath 
\"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202052 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwq72\" (UniqueName: \"kubernetes.io/projected/b950bb15-0796-4aa8-9920-6c0d3dd622e7-kube-api-access-wwq72\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.202062 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/b950bb15-0796-4aa8-9920-6c0d3dd622e7-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.446680 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.505329 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.796588 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"b950bb15-0796-4aa8-9920-6c0d3dd622e7","Type":"ContainerDied","Data":"545332d882267752a7fc0c2268f7c2474d414930cabb6e9b981fa928a8e47be4"} Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.796630 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="545332d882267752a7fc0c2268f7c2474d414930cabb6e9b981fa928a8e47be4" Mar 08 00:28:31 crc kubenswrapper[4713]: I0308 00:28:31.796683 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 08 00:28:33 crc kubenswrapper[4713]: I0308 00:28:33.579927 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "b950bb15-0796-4aa8-9920-6c0d3dd622e7" (UID: "b950bb15-0796-4aa8-9920-6c0d3dd622e7"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:33 crc kubenswrapper[4713]: I0308 00:28:33.631660 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b950bb15-0796-4aa8-9920-6c0d3dd622e7-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.527418 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 08 00:28:35 crc kubenswrapper[4713]: E0308 00:28:35.529125 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="manage-dockerfile" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.529239 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="manage-dockerfile" Mar 08 00:28:35 crc kubenswrapper[4713]: E0308 00:28:35.529326 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="docker-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.529403 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="docker-build" Mar 08 00:28:35 crc kubenswrapper[4713]: E0308 00:28:35.529482 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="git-clone" Mar 08 00:28:35 crc 
kubenswrapper[4713]: I0308 00:28:35.529552 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="git-clone" Mar 08 00:28:35 crc kubenswrapper[4713]: E0308 00:28:35.529637 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" containerName="oc" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.529714 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" containerName="oc" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.529954 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b950bb15-0796-4aa8-9920-6c0d3dd622e7" containerName="docker-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.530088 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" containerName="oc" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.531326 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.533702 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.534123 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.534185 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.534482 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.544454 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563085 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563143 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563218 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache\") 
pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563271 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szthp\" (UniqueName: \"kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563310 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563330 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563354 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563413 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563451 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563472 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563486 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.563504 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665145 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665186 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665210 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665231 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665261 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665266 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665335 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665359 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665374 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665396 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szthp\" (UniqueName: \"kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665421 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root\") pod 
\"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665437 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665454 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.665553 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666220 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666271 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" 
Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666326 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666394 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666426 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666500 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.666711 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.671599 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.671700 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.691445 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szthp\" (UniqueName: \"kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp\") pod \"sg-bridge-1-build\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:35 crc kubenswrapper[4713]: I0308 00:28:35.845542 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:36 crc kubenswrapper[4713]: I0308 00:28:36.240743 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 08 00:28:36 crc kubenswrapper[4713]: I0308 00:28:36.826696 4713 generic.go:334] "Generic (PLEG): container finished" podID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerID="01d9b7b88d08637099f2699ad9a25e90c9327b764008cf2cde4f1f7e06061451" exitCode=0 Mar 08 00:28:36 crc kubenswrapper[4713]: I0308 00:28:36.826766 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"e709cdbe-6c8e-4853-85f3-453fc41a930d","Type":"ContainerDied","Data":"01d9b7b88d08637099f2699ad9a25e90c9327b764008cf2cde4f1f7e06061451"} Mar 08 00:28:36 crc kubenswrapper[4713]: I0308 00:28:36.827034 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"e709cdbe-6c8e-4853-85f3-453fc41a930d","Type":"ContainerStarted","Data":"6aa4b25cf897f6651cfcca2cf0d7068ae5c2ea57809dec6519deb3bd9cef0432"} Mar 08 00:28:37 crc kubenswrapper[4713]: I0308 00:28:37.835438 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"e709cdbe-6c8e-4853-85f3-453fc41a930d","Type":"ContainerStarted","Data":"0fd1776a90badc7eb6f79de68dfeed110b30a49d06c2f0b0856f0e37b49744ef"} Mar 08 00:28:37 crc kubenswrapper[4713]: I0308 00:28:37.859342 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=2.8593250770000003 podStartE2EDuration="2.859325077s" podCreationTimestamp="2026-03-08 00:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:28:37.855754373 +0000 UTC m=+1371.975386606" watchObservedRunningTime="2026-03-08 00:28:37.859325077 +0000 UTC m=+1371.978957310" Mar 08 
00:28:43 crc kubenswrapper[4713]: I0308 00:28:43.871747 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_e709cdbe-6c8e-4853-85f3-453fc41a930d/docker-build/0.log" Mar 08 00:28:43 crc kubenswrapper[4713]: I0308 00:28:43.872693 4713 generic.go:334] "Generic (PLEG): container finished" podID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerID="0fd1776a90badc7eb6f79de68dfeed110b30a49d06c2f0b0856f0e37b49744ef" exitCode=1 Mar 08 00:28:43 crc kubenswrapper[4713]: I0308 00:28:43.872740 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"e709cdbe-6c8e-4853-85f3-453fc41a930d","Type":"ContainerDied","Data":"0fd1776a90badc7eb6f79de68dfeed110b30a49d06c2f0b0856f0e37b49744ef"} Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.114805 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_e709cdbe-6c8e-4853-85f3-453fc41a930d/docker-build/0.log" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.115512 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297271 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297315 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297342 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297366 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297407 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szthp\" (UniqueName: \"kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297425 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297436 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297475 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297497 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297515 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297541 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297561 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297590 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir\") pod \"e709cdbe-6c8e-4853-85f3-453fc41a930d\" (UID: \"e709cdbe-6c8e-4853-85f3-453fc41a930d\") " Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.297815 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.298246 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.298267 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). 
InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.298250 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.298396 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.299188 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.299354 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.303660 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp" (OuterVolumeSpecName: "kube-api-access-szthp") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "kube-api-access-szthp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.303796 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.303847 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.360498 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399192 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399223 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399232 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399239 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399250 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399259 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399270 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-blob-cache\") on node \"crc\" DevicePath \"\"" 
Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399277 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e709cdbe-6c8e-4853-85f3-453fc41a930d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399285 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/e709cdbe-6c8e-4853-85f3-453fc41a930d-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.399293 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szthp\" (UniqueName: \"kubernetes.io/projected/e709cdbe-6c8e-4853-85f3-453fc41a930d-kube-api-access-szthp\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.655856 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e709cdbe-6c8e-4853-85f3-453fc41a930d" (UID: "e709cdbe-6c8e-4853-85f3-453fc41a930d"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.703047 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e709cdbe-6c8e-4853-85f3-453fc41a930d-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.891175 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_e709cdbe-6c8e-4853-85f3-453fc41a930d/docker-build/0.log" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.891614 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"e709cdbe-6c8e-4853-85f3-453fc41a930d","Type":"ContainerDied","Data":"6aa4b25cf897f6651cfcca2cf0d7068ae5c2ea57809dec6519deb3bd9cef0432"} Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.891641 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aa4b25cf897f6651cfcca2cf0d7068ae5c2ea57809dec6519deb3bd9cef0432" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.891702 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.938679 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 08 00:28:45 crc kubenswrapper[4713]: I0308 00:28:45.946379 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 08 00:28:46 crc kubenswrapper[4713]: I0308 00:28:46.552662 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" path="/var/lib/kubelet/pods/e709cdbe-6c8e-4853-85f3-453fc41a930d/volumes" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.544980 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 08 00:28:47 crc kubenswrapper[4713]: E0308 00:28:47.545232 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerName="manage-dockerfile" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.545247 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerName="manage-dockerfile" Mar 08 00:28:47 crc kubenswrapper[4713]: E0308 00:28:47.545269 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerName="docker-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.545278 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerName="docker-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.545420 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e709cdbe-6c8e-4853-85f3-453fc41a930d" containerName="docker-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.546213 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.547833 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.547882 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.548688 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.549465 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.567103 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.625843 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.625897 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.625938 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.625977 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626046 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85p55\" (UniqueName: \"kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626185 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626254 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626280 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626329 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626378 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626408 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.626426 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727333 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727401 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727418 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727439 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727459 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727477 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727502 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727530 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727547 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85p55\" (UniqueName: \"kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727563 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727567 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727590 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727674 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727810 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.727879 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.728689 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir\") pod \"sg-bridge-2-build\" (UID: 
\"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.728790 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.728893 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.729013 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.729057 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.729059 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 
crc kubenswrapper[4713]: I0308 00:28:47.732305 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.737621 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.744758 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85p55\" (UniqueName: \"kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55\") pod \"sg-bridge-2-build\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:47 crc kubenswrapper[4713]: I0308 00:28:47.860441 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 08 00:28:48 crc kubenswrapper[4713]: I0308 00:28:48.095131 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 08 00:28:48 crc kubenswrapper[4713]: I0308 00:28:48.913976 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerStarted","Data":"e8197eadd8b22b6c38affe2ac83c08099c716aa9ad8e4e06b66822d8ef99c992"} Mar 08 00:28:48 crc kubenswrapper[4713]: I0308 00:28:48.914232 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerStarted","Data":"a2520b9b3a4b16e4135496c66c365e09d95199d867e2488dcaa113a1fb909a14"} Mar 08 00:28:49 crc kubenswrapper[4713]: I0308 00:28:49.928201 4713 generic.go:334] "Generic (PLEG): container finished" podID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerID="e8197eadd8b22b6c38affe2ac83c08099c716aa9ad8e4e06b66822d8ef99c992" exitCode=0 Mar 08 00:28:49 crc kubenswrapper[4713]: I0308 00:28:49.928271 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerDied","Data":"e8197eadd8b22b6c38affe2ac83c08099c716aa9ad8e4e06b66822d8ef99c992"} Mar 08 00:28:50 crc kubenswrapper[4713]: I0308 00:28:50.937664 4713 generic.go:334] "Generic (PLEG): container finished" podID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerID="4c71b1a2140085bec3748281dfbee0833851e2d135c6b9e429fa875adb54d2c5" exitCode=0 Mar 08 00:28:50 crc kubenswrapper[4713]: I0308 00:28:50.937713 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerDied","Data":"4c71b1a2140085bec3748281dfbee0833851e2d135c6b9e429fa875adb54d2c5"} Mar 08 00:28:51 
crc kubenswrapper[4713]: I0308 00:28:51.001885 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_1c4738b4-e463-4bb9-a2dc-0a7861232c1d/manage-dockerfile/0.log" Mar 08 00:28:51 crc kubenswrapper[4713]: I0308 00:28:51.946966 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerStarted","Data":"a5f742a87c3757dc2da796fc52b28b2a5f71ef0f553420d2b30529401d1853b2"} Mar 08 00:28:51 crc kubenswrapper[4713]: I0308 00:28:51.976787 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=4.976761364 podStartE2EDuration="4.976761364s" podCreationTimestamp="2026-03-08 00:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:28:51.971559427 +0000 UTC m=+1386.091191700" watchObservedRunningTime="2026-03-08 00:28:51.976761364 +0000 UTC m=+1386.096393637" Mar 08 00:29:04 crc kubenswrapper[4713]: I0308 00:29:04.500387 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:29:04 crc kubenswrapper[4713]: I0308 00:29:04.500925 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:29:32 crc kubenswrapper[4713]: I0308 00:29:32.220970 4713 generic.go:334] "Generic (PLEG): container finished" 
podID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerID="a5f742a87c3757dc2da796fc52b28b2a5f71ef0f553420d2b30529401d1853b2" exitCode=0 Mar 08 00:29:32 crc kubenswrapper[4713]: I0308 00:29:32.221009 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerDied","Data":"a5f742a87c3757dc2da796fc52b28b2a5f71ef0f553420d2b30529401d1853b2"} Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.569379 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.725890 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85p55\" (UniqueName: \"kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.725952 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.725991 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726032 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run\") pod 
\"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726057 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726078 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726054 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726126 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726170 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726191 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726230 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726253 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726291 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles\") pod \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\" (UID: \"1c4738b4-e463-4bb9-a2dc-0a7861232c1d\") " Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726325 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726643 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726659 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.726973 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.727076 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.727376 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.727607 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.729349 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.733078 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55" (OuterVolumeSpecName: "kube-api-access-85p55") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "kube-api-access-85p55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.733514 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.734083 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827514 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85p55\" (UniqueName: \"kubernetes.io/projected/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-kube-api-access-85p55\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827550 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827560 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827569 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827581 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827595 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827603 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-system-configs\") 
on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.827612 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.862332 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:33 crc kubenswrapper[4713]: I0308 00:29:33.929268 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.239887 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.239879 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"1c4738b4-e463-4bb9-a2dc-0a7861232c1d","Type":"ContainerDied","Data":"a2520b9b3a4b16e4135496c66c365e09d95199d867e2488dcaa113a1fb909a14"} Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.240037 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2520b9b3a4b16e4135496c66c365e09d95199d867e2488dcaa113a1fb909a14" Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.442541 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "1c4738b4-e463-4bb9-a2dc-0a7861232c1d" (UID: "1c4738b4-e463-4bb9-a2dc-0a7861232c1d"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.501462 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.501526 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:29:34 crc kubenswrapper[4713]: I0308 00:29:34.538200 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c4738b4-e463-4bb9-a2dc-0a7861232c1d-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.299492 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:37 crc kubenswrapper[4713]: E0308 00:29:37.300072 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="git-clone" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.300088 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="git-clone" Mar 08 00:29:37 crc kubenswrapper[4713]: E0308 00:29:37.300105 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="manage-dockerfile" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.300114 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="manage-dockerfile" Mar 08 00:29:37 crc kubenswrapper[4713]: E0308 00:29:37.300127 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="docker-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.300136 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="docker-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.300274 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c4738b4-e463-4bb9-a2dc-0a7861232c1d" containerName="docker-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.301018 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.305251 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.305322 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.305489 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.305779 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.314908 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476568 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwpf6\" (UniqueName: 
\"kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476619 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476649 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476681 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476697 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 
crc kubenswrapper[4713]: I0308 00:29:37.476779 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476875 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476907 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476932 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.476954 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache\") pod 
\"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.477035 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.477057 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578179 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578244 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578279 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578376 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578414 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578461 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwpf6\" (UniqueName: \"kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578502 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578552 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578619 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578651 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578682 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578738 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 
00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578874 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.578962 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579160 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579321 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579429 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 
crc kubenswrapper[4713]: I0308 00:29:37.579606 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579720 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579717 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.579813 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.586420 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.587368 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.599035 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwpf6\" (UniqueName: \"kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:37 crc kubenswrapper[4713]: I0308 00:29:37.614258 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:38 crc kubenswrapper[4713]: I0308 00:29:38.009109 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:38 crc kubenswrapper[4713]: I0308 00:29:38.272422 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"88dd7370-e036-44f4-906c-a03f3798ee7f","Type":"ContainerStarted","Data":"7d3540bca41e1d46698b63f4f058eb4474315d57b0b021937080565a780badb3"} Mar 08 00:29:39 crc kubenswrapper[4713]: I0308 00:29:39.278885 4713 generic.go:334] "Generic (PLEG): container finished" podID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerID="75f326f596c8074d0e0004f8348e7fb30d0d25afd989dc3fd48ceff0a95f0e78" exitCode=0 Mar 08 00:29:39 crc kubenswrapper[4713]: I0308 00:29:39.278938 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"88dd7370-e036-44f4-906c-a03f3798ee7f","Type":"ContainerDied","Data":"75f326f596c8074d0e0004f8348e7fb30d0d25afd989dc3fd48ceff0a95f0e78"} Mar 08 00:29:40 crc kubenswrapper[4713]: I0308 00:29:40.288450 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"88dd7370-e036-44f4-906c-a03f3798ee7f","Type":"ContainerStarted","Data":"2eff2ac31edbd43932db6de20f228de707d0a6a6b091aefa358db6b0a6ac4bbc"} Mar 08 00:29:40 crc kubenswrapper[4713]: I0308 00:29:40.316293 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.316258436 podStartE2EDuration="3.316258436s" podCreationTimestamp="2026-03-08 00:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:29:40.310408282 +0000 UTC m=+1434.430040515" watchObservedRunningTime="2026-03-08 00:29:40.316258436 +0000 UTC m=+1434.435890689" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.024751 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.025577 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="docker-build" containerID="cri-o://2eff2ac31edbd43932db6de20f228de707d0a6a6b091aefa358db6b0a6ac4bbc" gracePeriod=30 Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.342042 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_88dd7370-e036-44f4-906c-a03f3798ee7f/docker-build/0.log" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.342764 4713 generic.go:334] "Generic (PLEG): 
container finished" podID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerID="2eff2ac31edbd43932db6de20f228de707d0a6a6b091aefa358db6b0a6ac4bbc" exitCode=1 Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.342803 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"88dd7370-e036-44f4-906c-a03f3798ee7f","Type":"ContainerDied","Data":"2eff2ac31edbd43932db6de20f228de707d0a6a6b091aefa358db6b0a6ac4bbc"} Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.380000 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_88dd7370-e036-44f4-906c-a03f3798ee7f/docker-build/0.log" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.380591 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521314 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwpf6\" (UniqueName: \"kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521419 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521465 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " 
Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521543 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521589 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521641 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.521707 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522239 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522371 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522438 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522477 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522543 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522615 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 
00:29:48.522765 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.522814 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run\") pod \"88dd7370-e036-44f4-906c-a03f3798ee7f\" (UID: \"88dd7370-e036-44f4-906c-a03f3798ee7f\") " Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523228 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523247 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523260 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88dd7370-e036-44f4-906c-a03f3798ee7f-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523421 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523725 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.523780 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.524538 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.527487 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.527512 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6" (OuterVolumeSpecName: "kube-api-access-vwpf6") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "kube-api-access-vwpf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.527863 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.591359 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624320 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624353 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624363 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624373 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwpf6\" (UniqueName: \"kubernetes.io/projected/88dd7370-e036-44f4-906c-a03f3798ee7f-kube-api-access-vwpf6\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624383 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624392 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624403 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88dd7370-e036-44f4-906c-a03f3798ee7f-build-system-configs\") on node \"crc\" 
DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.624414 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88dd7370-e036-44f4-906c-a03f3798ee7f-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.883678 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "88dd7370-e036-44f4-906c-a03f3798ee7f" (UID: "88dd7370-e036-44f4-906c-a03f3798ee7f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:29:48 crc kubenswrapper[4713]: I0308 00:29:48.928545 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88dd7370-e036-44f4-906c-a03f3798ee7f-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.351231 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_88dd7370-e036-44f4-906c-a03f3798ee7f/docker-build/0.log" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.352309 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"88dd7370-e036-44f4-906c-a03f3798ee7f","Type":"ContainerDied","Data":"7d3540bca41e1d46698b63f4f058eb4474315d57b0b021937080565a780badb3"} Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.352353 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.352386 4713 scope.go:117] "RemoveContainer" containerID="2eff2ac31edbd43932db6de20f228de707d0a6a6b091aefa358db6b0a6ac4bbc" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.386548 4713 scope.go:117] "RemoveContainer" containerID="75f326f596c8074d0e0004f8348e7fb30d0d25afd989dc3fd48ceff0a95f0e78" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.404930 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.409448 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.674297 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 08 00:29:49 crc kubenswrapper[4713]: E0308 00:29:49.674624 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="docker-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.674645 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="docker-build" Mar 08 00:29:49 crc kubenswrapper[4713]: E0308 00:29:49.674670 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="manage-dockerfile" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.674683 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="manage-dockerfile" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.674930 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" containerName="docker-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 
00:29:49.676788 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.679551 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.680682 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.680720 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.681036 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.690238 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839128 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839294 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 
00:29:49.839348 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839429 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839495 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839555 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839658 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push\") pod 
\"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.839895 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.840043 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.840133 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.840201 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.840302 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-s8flb\" (UniqueName: \"kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941347 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941410 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941443 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941460 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 
00:29:49.941486 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941508 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941528 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941543 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941565 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941581 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8flb\" (UniqueName: \"kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941618 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941643 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.941712 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942035 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run\") pod 
\"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942269 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942304 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942597 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942727 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942739 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs\") 
pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.942932 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.943023 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.948529 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.948673 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:49 crc kubenswrapper[4713]: I0308 00:29:49.958644 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8flb\" (UniqueName: 
\"kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:50 crc kubenswrapper[4713]: I0308 00:29:50.028550 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:29:50 crc kubenswrapper[4713]: I0308 00:29:50.462051 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 08 00:29:50 crc kubenswrapper[4713]: I0308 00:29:50.551846 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88dd7370-e036-44f4-906c-a03f3798ee7f" path="/var/lib/kubelet/pods/88dd7370-e036-44f4-906c-a03f3798ee7f/volumes" Mar 08 00:29:51 crc kubenswrapper[4713]: I0308 00:29:51.372919 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerStarted","Data":"370db0d532375790e272a8831b8b06a7bffa1a6206bbbefdccfe8ec69ece8ac8"} Mar 08 00:29:51 crc kubenswrapper[4713]: I0308 00:29:51.373264 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerStarted","Data":"8ce83e4ab9056a34d36195fcc4e1477a5d85172933c94b6817b335d367e82a90"} Mar 08 00:29:52 crc kubenswrapper[4713]: I0308 00:29:52.382391 4713 generic.go:334] "Generic (PLEG): container finished" podID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerID="370db0d532375790e272a8831b8b06a7bffa1a6206bbbefdccfe8ec69ece8ac8" exitCode=0 Mar 08 00:29:52 crc kubenswrapper[4713]: I0308 00:29:52.382432 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" 
event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerDied","Data":"370db0d532375790e272a8831b8b06a7bffa1a6206bbbefdccfe8ec69ece8ac8"} Mar 08 00:29:53 crc kubenswrapper[4713]: I0308 00:29:53.389906 4713 generic.go:334] "Generic (PLEG): container finished" podID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerID="769ad33b4491cfff20c31acfa1bd44ca25f93ee045dbe8cee23176ab67a67457" exitCode=0 Mar 08 00:29:53 crc kubenswrapper[4713]: I0308 00:29:53.390019 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerDied","Data":"769ad33b4491cfff20c31acfa1bd44ca25f93ee045dbe8cee23176ab67a67457"} Mar 08 00:29:53 crc kubenswrapper[4713]: I0308 00:29:53.431580 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_88b0640d-1c8b-4309-bce2-011f21f4578c/manage-dockerfile/0.log" Mar 08 00:29:54 crc kubenswrapper[4713]: I0308 00:29:54.402244 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerStarted","Data":"ac8230c8632760ddb3ac19a198cfc4522ce2a67e22d1b6707a6f5ecde314ae5d"} Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.128522 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=11.128502931 podStartE2EDuration="11.128502931s" podCreationTimestamp="2026-03-08 00:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:29:54.438041078 +0000 UTC m=+1448.557673321" watchObservedRunningTime="2026-03-08 00:30:00.128502931 +0000 UTC m=+1454.248135164" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.134111 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29548830-csc8c"] Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.135014 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548830-csc8c" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.138327 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.138905 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.141722 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548830-csc8c"] Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.143070 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.236403 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"] Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.237496 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.240379 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.244150 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.248380 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"] Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.272409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5hvn\" (UniqueName: \"kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn\") pod \"auto-csr-approver-29548830-csc8c\" (UID: \"2b849b06-281c-44be-a061-ca5b3905b3e1\") " pod="openshift-infra/auto-csr-approver-29548830-csc8c" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.373784 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.373930 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx6pb\" (UniqueName: \"kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.373959 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5hvn\" (UniqueName: \"kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn\") pod \"auto-csr-approver-29548830-csc8c\" (UID: \"2b849b06-281c-44be-a061-ca5b3905b3e1\") " pod="openshift-infra/auto-csr-approver-29548830-csc8c" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.374000 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.397717 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5hvn\" (UniqueName: \"kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn\") pod \"auto-csr-approver-29548830-csc8c\" (UID: \"2b849b06-281c-44be-a061-ca5b3905b3e1\") " pod="openshift-infra/auto-csr-approver-29548830-csc8c" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.452425 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548830-csc8c" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.475265 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.475323 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.475366 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6pb\" (UniqueName: \"kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.476333 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.479418 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.494563 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6pb\" (UniqueName: \"kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb\") pod \"collect-profiles-29548830-rntpn\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.554214 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.826691 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548830-csc8c"] Mar 08 00:30:00 crc kubenswrapper[4713]: I0308 00:30:00.935960 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn"] Mar 08 00:30:01 crc kubenswrapper[4713]: I0308 00:30:01.462138 4713 generic.go:334] "Generic (PLEG): container finished" podID="2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" containerID="d4b4e5eefafcfeaee5c9f5d40fb853163f9b52c4141752f9abd458d295d15c7b" exitCode=0 Mar 08 00:30:01 crc kubenswrapper[4713]: I0308 00:30:01.462209 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" event={"ID":"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a","Type":"ContainerDied","Data":"d4b4e5eefafcfeaee5c9f5d40fb853163f9b52c4141752f9abd458d295d15c7b"} Mar 08 00:30:01 crc kubenswrapper[4713]: I0308 00:30:01.462508 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" event={"ID":"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a","Type":"ContainerStarted","Data":"a58d30d9343e00de15613956067fd9d24207e80c80ac1f4f3bf1ed2d3a133693"} Mar 08 00:30:01 crc kubenswrapper[4713]: I0308 00:30:01.463611 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548830-csc8c" event={"ID":"2b849b06-281c-44be-a061-ca5b3905b3e1","Type":"ContainerStarted","Data":"5be7eb95db2318c93c52ad0aca0a0a0290dd335215e1095f65cf0bd15245c316"} Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.480655 4713 generic.go:334] "Generic (PLEG): container finished" podID="2b849b06-281c-44be-a061-ca5b3905b3e1" containerID="5ffd3bb6cf22ba954a7e67226be2ca668fd3bb44939915e41b40c3c5cd452879" exitCode=0 Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.480721 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548830-csc8c" event={"ID":"2b849b06-281c-44be-a061-ca5b3905b3e1","Type":"ContainerDied","Data":"5ffd3bb6cf22ba954a7e67226be2ca668fd3bb44939915e41b40c3c5cd452879"} Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.761531 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.904344 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume\") pod \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.904413 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume\") pod \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.904470 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx6pb\" (UniqueName: \"kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb\") pod \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\" (UID: \"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a\") " Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.905278 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume" (OuterVolumeSpecName: "config-volume") pod "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" (UID: "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.910003 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" (UID: "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:30:02 crc kubenswrapper[4713]: I0308 00:30:02.910043 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb" (OuterVolumeSpecName: "kube-api-access-vx6pb") pod "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" (UID: "2bcfc109-be57-4b72-a9a2-7a7a735bbd1a"). InnerVolumeSpecName "kube-api-access-vx6pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.005413 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.005449 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx6pb\" (UniqueName: \"kubernetes.io/projected/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-kube-api-access-vx6pb\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.005460 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2bcfc109-be57-4b72-a9a2-7a7a735bbd1a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.488069 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" event={"ID":"2bcfc109-be57-4b72-a9a2-7a7a735bbd1a","Type":"ContainerDied","Data":"a58d30d9343e00de15613956067fd9d24207e80c80ac1f4f3bf1ed2d3a133693"} Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.488417 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a58d30d9343e00de15613956067fd9d24207e80c80ac1f4f3bf1ed2d3a133693" Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.488084 4713 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29548830-rntpn" Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.736839 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548830-csc8c" Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.917171 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5hvn\" (UniqueName: \"kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn\") pod \"2b849b06-281c-44be-a061-ca5b3905b3e1\" (UID: \"2b849b06-281c-44be-a061-ca5b3905b3e1\") " Mar 08 00:30:03 crc kubenswrapper[4713]: I0308 00:30:03.922541 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn" (OuterVolumeSpecName: "kube-api-access-z5hvn") pod "2b849b06-281c-44be-a061-ca5b3905b3e1" (UID: "2b849b06-281c-44be-a061-ca5b3905b3e1"). InnerVolumeSpecName "kube-api-access-z5hvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.018849 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5hvn\" (UniqueName: \"kubernetes.io/projected/2b849b06-281c-44be-a061-ca5b3905b3e1-kube-api-access-z5hvn\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.495758 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548830-csc8c" event={"ID":"2b849b06-281c-44be-a061-ca5b3905b3e1","Type":"ContainerDied","Data":"5be7eb95db2318c93c52ad0aca0a0a0290dd335215e1095f65cf0bd15245c316"} Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.495800 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5be7eb95db2318c93c52ad0aca0a0a0290dd335215e1095f65cf0bd15245c316" Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.495883 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548830-csc8c" Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.500286 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.500362 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.500431 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.501106 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.501175 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c" gracePeriod=600 Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.826973 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548824-mrbjn"] Mar 08 00:30:04 crc kubenswrapper[4713]: I0308 00:30:04.832247 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548824-mrbjn"] Mar 08 00:30:05 crc kubenswrapper[4713]: I0308 00:30:05.504283 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c" exitCode=0 Mar 08 00:30:05 crc kubenswrapper[4713]: I0308 00:30:05.504465 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c"} Mar 08 00:30:05 crc kubenswrapper[4713]: I0308 00:30:05.504587 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"} Mar 08 00:30:05 crc kubenswrapper[4713]: I0308 00:30:05.504609 4713 scope.go:117] "RemoveContainer" containerID="c9719f0bfb278b285d17679470509ae6172a8ecfd762a13c6a85c14fdaf89f7f" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.357661 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"] Mar 08 00:30:06 crc kubenswrapper[4713]: E0308 00:30:06.358046 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b849b06-281c-44be-a061-ca5b3905b3e1" containerName="oc" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.358072 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b849b06-281c-44be-a061-ca5b3905b3e1" containerName="oc" Mar 08 00:30:06 crc kubenswrapper[4713]: E0308 00:30:06.358103 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" containerName="collect-profiles" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.358114 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" containerName="collect-profiles" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.358249 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bcfc109-be57-4b72-a9a2-7a7a735bbd1a" containerName="collect-profiles" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.358275 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b849b06-281c-44be-a061-ca5b3905b3e1" containerName="oc" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.359471 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.374968 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"] Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.548716 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42829204-3911-4926-bcab-0e8f7b731986" path="/var/lib/kubelet/pods/42829204-3911-4926-bcab-0e8f7b731986/volumes" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.552712 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.552792 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.554412 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tgt5\" (UniqueName: \"kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.655335 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content\") pod 
\"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.655387 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.655478 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tgt5\" (UniqueName: \"kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.656073 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.656090 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content\") pod \"redhat-operators-69sgm\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.685679 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tgt5\" (UniqueName: \"kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5\") pod \"redhat-operators-69sgm\" (UID: 
\"957eea24-22ac-426b-abe9-996fdf130d19\") " pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:06 crc kubenswrapper[4713]: I0308 00:30:06.975869 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:07 crc kubenswrapper[4713]: I0308 00:30:07.224184 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"] Mar 08 00:30:07 crc kubenswrapper[4713]: I0308 00:30:07.532842 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerStarted","Data":"6ee1793a7046481e3b27dc2ceab643fe6b040c613a76144c47e73cc34a82299e"} Mar 08 00:30:07 crc kubenswrapper[4713]: I0308 00:30:07.532897 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerStarted","Data":"c62b23f71ee51e106272541d3a540821e4dd4bad864cddd3f75035f7ae8459dd"} Mar 08 00:30:08 crc kubenswrapper[4713]: I0308 00:30:08.540995 4713 generic.go:334] "Generic (PLEG): container finished" podID="957eea24-22ac-426b-abe9-996fdf130d19" containerID="6ee1793a7046481e3b27dc2ceab643fe6b040c613a76144c47e73cc34a82299e" exitCode=0 Mar 08 00:30:08 crc kubenswrapper[4713]: I0308 00:30:08.551071 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerDied","Data":"6ee1793a7046481e3b27dc2ceab643fe6b040c613a76144c47e73cc34a82299e"} Mar 08 00:30:09 crc kubenswrapper[4713]: I0308 00:30:09.550027 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerStarted","Data":"6970978d105d04a32e067fed848c317ebfbf71c6d52fbb18b171f28aed0508e4"} Mar 08 
00:30:10 crc kubenswrapper[4713]: I0308 00:30:10.557406 4713 generic.go:334] "Generic (PLEG): container finished" podID="957eea24-22ac-426b-abe9-996fdf130d19" containerID="6970978d105d04a32e067fed848c317ebfbf71c6d52fbb18b171f28aed0508e4" exitCode=0 Mar 08 00:30:10 crc kubenswrapper[4713]: I0308 00:30:10.557586 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerDied","Data":"6970978d105d04a32e067fed848c317ebfbf71c6d52fbb18b171f28aed0508e4"} Mar 08 00:30:11 crc kubenswrapper[4713]: I0308 00:30:11.564671 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerStarted","Data":"0b9ff58ca69afc612cfb0165be5630dc5ee51a065fa1714961148c4481c78a9c"} Mar 08 00:30:11 crc kubenswrapper[4713]: I0308 00:30:11.582077 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-69sgm" podStartSLOduration=3.054591415 podStartE2EDuration="5.58206015s" podCreationTimestamp="2026-03-08 00:30:06 +0000 UTC" firstStartedPulling="2026-03-08 00:30:08.543057799 +0000 UTC m=+1462.662690032" lastFinishedPulling="2026-03-08 00:30:11.070526544 +0000 UTC m=+1465.190158767" observedRunningTime="2026-03-08 00:30:11.580711555 +0000 UTC m=+1465.700343798" watchObservedRunningTime="2026-03-08 00:30:11.58206015 +0000 UTC m=+1465.701692383" Mar 08 00:30:15 crc kubenswrapper[4713]: I0308 00:30:15.200132 4713 scope.go:117] "RemoveContainer" containerID="5194adfd055d923428c5bad5d8993dba160fbbc540dca7c2cc8ef69daad1dbf4" Mar 08 00:30:16 crc kubenswrapper[4713]: I0308 00:30:16.976913 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:16 crc kubenswrapper[4713]: I0308 00:30:16.977197 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:18 crc kubenswrapper[4713]: I0308 00:30:18.036303 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-69sgm" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="registry-server" probeResult="failure" output=< Mar 08 00:30:18 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 08 00:30:18 crc kubenswrapper[4713]: > Mar 08 00:30:27 crc kubenswrapper[4713]: I0308 00:30:27.019452 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:27 crc kubenswrapper[4713]: I0308 00:30:27.059116 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:30 crc kubenswrapper[4713]: I0308 00:30:30.638202 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"] Mar 08 00:30:30 crc kubenswrapper[4713]: I0308 00:30:30.639107 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-69sgm" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="registry-server" containerID="cri-o://0b9ff58ca69afc612cfb0165be5630dc5ee51a065fa1714961148c4481c78a9c" gracePeriod=2 Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.096102 4713 generic.go:334] "Generic (PLEG): container finished" podID="957eea24-22ac-426b-abe9-996fdf130d19" containerID="0b9ff58ca69afc612cfb0165be5630dc5ee51a065fa1714961148c4481c78a9c" exitCode=0 Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.096142 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerDied","Data":"0b9ff58ca69afc612cfb0165be5630dc5ee51a065fa1714961148c4481c78a9c"} Mar 08 00:30:31 crc 
kubenswrapper[4713]: I0308 00:30:31.503721 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.608413 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content\") pod \"957eea24-22ac-426b-abe9-996fdf130d19\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.608451 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tgt5\" (UniqueName: \"kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5\") pod \"957eea24-22ac-426b-abe9-996fdf130d19\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.608596 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities\") pod \"957eea24-22ac-426b-abe9-996fdf130d19\" (UID: \"957eea24-22ac-426b-abe9-996fdf130d19\") " Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.609766 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities" (OuterVolumeSpecName: "utilities") pod "957eea24-22ac-426b-abe9-996fdf130d19" (UID: "957eea24-22ac-426b-abe9-996fdf130d19"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.615517 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5" (OuterVolumeSpecName: "kube-api-access-7tgt5") pod "957eea24-22ac-426b-abe9-996fdf130d19" (UID: "957eea24-22ac-426b-abe9-996fdf130d19"). InnerVolumeSpecName "kube-api-access-7tgt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.710664 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.710710 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tgt5\" (UniqueName: \"kubernetes.io/projected/957eea24-22ac-426b-abe9-996fdf130d19-kube-api-access-7tgt5\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.730301 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "957eea24-22ac-426b-abe9-996fdf130d19" (UID: "957eea24-22ac-426b-abe9-996fdf130d19"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:31 crc kubenswrapper[4713]: I0308 00:30:31.812189 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/957eea24-22ac-426b-abe9-996fdf130d19-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.105290 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69sgm" event={"ID":"957eea24-22ac-426b-abe9-996fdf130d19","Type":"ContainerDied","Data":"c62b23f71ee51e106272541d3a540821e4dd4bad864cddd3f75035f7ae8459dd"} Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.105339 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69sgm" Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.105379 4713 scope.go:117] "RemoveContainer" containerID="0b9ff58ca69afc612cfb0165be5630dc5ee51a065fa1714961148c4481c78a9c" Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.132023 4713 scope.go:117] "RemoveContainer" containerID="6970978d105d04a32e067fed848c317ebfbf71c6d52fbb18b171f28aed0508e4" Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.145888 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"] Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.150151 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-69sgm"] Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.179233 4713 scope.go:117] "RemoveContainer" containerID="6ee1793a7046481e3b27dc2ceab643fe6b040c613a76144c47e73cc34a82299e" Mar 08 00:30:32 crc kubenswrapper[4713]: I0308 00:30:32.547916 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="957eea24-22ac-426b-abe9-996fdf130d19" path="/var/lib/kubelet/pods/957eea24-22ac-426b-abe9-996fdf130d19/volumes" Mar 08 00:30:45 crc 
kubenswrapper[4713]: I0308 00:30:45.193416 4713 generic.go:334] "Generic (PLEG): container finished" podID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerID="ac8230c8632760ddb3ac19a198cfc4522ce2a67e22d1b6707a6f5ecde314ae5d" exitCode=0 Mar 08 00:30:45 crc kubenswrapper[4713]: I0308 00:30:45.193501 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerDied","Data":"ac8230c8632760ddb3ac19a198cfc4522ce2a67e22d1b6707a6f5ecde314ae5d"} Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.447351 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.602264 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.602632 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.602730 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.602798 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.602905 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.603796 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.603962 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.603644 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604014 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604054 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604209 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604420 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604656 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604739 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604840 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.604946 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8flb\" (UniqueName: \"kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605038 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs\") pod \"88b0640d-1c8b-4309-bce2-011f21f4578c\" (UID: \"88b0640d-1c8b-4309-bce2-011f21f4578c\") " Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605319 4713 
reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605380 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605443 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605497 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605602 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/88b0640d-1c8b-4309-bce2-011f21f4578c-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.605983 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.606104 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.607493 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.608376 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.610198 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb" (OuterVolumeSpecName: "kube-api-access-s8flb") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "kube-api-access-s8flb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.706757 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.707333 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.707405 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8flb\" (UniqueName: \"kubernetes.io/projected/88b0640d-1c8b-4309-bce2-011f21f4578c-kube-api-access-s8flb\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.707469 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/88b0640d-1c8b-4309-bce2-011f21f4578c-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.707528 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/88b0640d-1c8b-4309-bce2-011f21f4578c-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.713620 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:46 crc kubenswrapper[4713]: I0308 00:30:46.808991 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.210569 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"88b0640d-1c8b-4309-bce2-011f21f4578c","Type":"ContainerDied","Data":"8ce83e4ab9056a34d36195fcc4e1477a5d85172933c94b6817b335d367e82a90"} Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.210608 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ce83e4ab9056a34d36195fcc4e1477a5d85172933c94b6817b335d367e82a90" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.210634 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.453137 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "88b0640d-1c8b-4309-bce2-011f21f4578c" (UID: "88b0640d-1c8b-4309-bce2-011f21f4578c"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.498627 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ntck8"] Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499193 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="manage-dockerfile" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499221 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="manage-dockerfile" Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499236 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="extract-utilities" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499244 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="extract-utilities" Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499273 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="extract-content" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499279 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="extract-content" Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499287 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="docker-build" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499294 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="docker-build" Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499305 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" 
containerName="git-clone" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499313 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="git-clone" Mar 08 00:30:47 crc kubenswrapper[4713]: E0308 00:30:47.499354 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="registry-server" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499361 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="registry-server" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499555 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b0640d-1c8b-4309-bce2-011f21f4578c" containerName="docker-build" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.499569 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="957eea24-22ac-426b-abe9-996fdf130d19" containerName="registry-server" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.500616 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.511729 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntck8"] Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.521170 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6mb5\" (UniqueName: \"kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.521497 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.521633 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.521871 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/88b0640d-1c8b-4309-bce2-011f21f4578c-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.624275 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6mb5\" (UniqueName: 
\"kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.625694 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.625807 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.626210 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.626297 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.651364 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6mb5\" (UniqueName: 
\"kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5\") pod \"certified-operators-ntck8\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:47 crc kubenswrapper[4713]: I0308 00:30:47.819405 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:48 crc kubenswrapper[4713]: I0308 00:30:48.114920 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntck8"] Mar 08 00:30:48 crc kubenswrapper[4713]: I0308 00:30:48.234244 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerStarted","Data":"cb867c50000d8b99afc6f684e5f719dee646322428c1a08e8dabaed01ce9d7d2"} Mar 08 00:30:49 crc kubenswrapper[4713]: I0308 00:30:49.242210 4713 generic.go:334] "Generic (PLEG): container finished" podID="e89ade9c-892d-466e-bfaa-f45237078d28" containerID="5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4" exitCode=0 Mar 08 00:30:49 crc kubenswrapper[4713]: I0308 00:30:49.242413 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerDied","Data":"5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4"} Mar 08 00:30:50 crc kubenswrapper[4713]: I0308 00:30:50.251571 4713 generic.go:334] "Generic (PLEG): container finished" podID="e89ade9c-892d-466e-bfaa-f45237078d28" containerID="27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff" exitCode=0 Mar 08 00:30:50 crc kubenswrapper[4713]: I0308 00:30:50.251695 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" 
event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerDied","Data":"27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff"} Mar 08 00:30:51 crc kubenswrapper[4713]: I0308 00:30:51.259942 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerStarted","Data":"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d"} Mar 08 00:30:51 crc kubenswrapper[4713]: I0308 00:30:51.305636 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ntck8" podStartSLOduration=2.93329394 podStartE2EDuration="4.30561594s" podCreationTimestamp="2026-03-08 00:30:47 +0000 UTC" firstStartedPulling="2026-03-08 00:30:49.244570836 +0000 UTC m=+1503.364203069" lastFinishedPulling="2026-03-08 00:30:50.616892836 +0000 UTC m=+1504.736525069" observedRunningTime="2026-03-08 00:30:51.302683232 +0000 UTC m=+1505.422315465" watchObservedRunningTime="2026-03-08 00:30:51.30561594 +0000 UTC m=+1505.425248173" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.441177 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.442596 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.444368 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.444768 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.445350 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.445363 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.461749 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.526878 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzx9v\" (UniqueName: \"kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.526923 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc 
kubenswrapper[4713]: I0308 00:30:55.526949 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.526969 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.526991 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527011 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527034 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: 
\"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527153 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527269 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527363 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527446 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.527487 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628698 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzx9v\" (UniqueName: \"kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628745 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628770 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628789 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628807 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628838 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628860 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628883 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 
08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628914 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628931 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628950 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.628972 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629070 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets\") pod 
\"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629464 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629532 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629875 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629919 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.629938 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.630016 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.630249 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.630401 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.634653 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 
00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.638194 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.646033 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzx9v\" (UniqueName: \"kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.757210 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:30:55 crc kubenswrapper[4713]: I0308 00:30:55.984982 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 08 00:30:55 crc kubenswrapper[4713]: W0308 00:30:55.988978 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3c490af_d8bd_4659_b51d_6aec80c439c8.slice/crio-faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900 WatchSource:0}: Error finding container faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900: Status 404 returned error can't find the container with id faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900 Mar 08 00:30:56 crc kubenswrapper[4713]: I0308 00:30:56.297105 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" 
event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerStarted","Data":"8a4854cb64f8a7f1201a66c3c0908bf11c24711a500d6939eec4e2631a9a94e6"} Mar 08 00:30:56 crc kubenswrapper[4713]: I0308 00:30:56.297194 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerStarted","Data":"faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900"} Mar 08 00:30:57 crc kubenswrapper[4713]: I0308 00:30:57.304319 4713 generic.go:334] "Generic (PLEG): container finished" podID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerID="8a4854cb64f8a7f1201a66c3c0908bf11c24711a500d6939eec4e2631a9a94e6" exitCode=0 Mar 08 00:30:57 crc kubenswrapper[4713]: I0308 00:30:57.304364 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerDied","Data":"8a4854cb64f8a7f1201a66c3c0908bf11c24711a500d6939eec4e2631a9a94e6"} Mar 08 00:30:57 crc kubenswrapper[4713]: I0308 00:30:57.820086 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:57 crc kubenswrapper[4713]: I0308 00:30:57.820456 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:57 crc kubenswrapper[4713]: I0308 00:30:57.868306 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:58 crc kubenswrapper[4713]: I0308 00:30:58.311992 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerStarted","Data":"fd8002808c5d3f13b3b01cadcdced7f1edb530c711896d763103800ccc5d24e3"} Mar 08 00:30:58 
crc kubenswrapper[4713]: I0308 00:30:58.351655 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:30:58 crc kubenswrapper[4713]: I0308 00:30:58.368773 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-1-build" podStartSLOduration=3.368726661 podStartE2EDuration="3.368726661s" podCreationTimestamp="2026-03-08 00:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:30:58.336141596 +0000 UTC m=+1512.455773849" watchObservedRunningTime="2026-03-08 00:30:58.368726661 +0000 UTC m=+1512.488358924" Mar 08 00:30:58 crc kubenswrapper[4713]: I0308 00:30:58.400799 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntck8"] Mar 08 00:30:59 crc kubenswrapper[4713]: I0308 00:30:59.322909 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_d3c490af-d8bd-4659-b51d-6aec80c439c8/docker-build/0.log" Mar 08 00:30:59 crc kubenswrapper[4713]: I0308 00:30:59.324562 4713 generic.go:334] "Generic (PLEG): container finished" podID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerID="fd8002808c5d3f13b3b01cadcdced7f1edb530c711896d763103800ccc5d24e3" exitCode=1 Mar 08 00:30:59 crc kubenswrapper[4713]: I0308 00:30:59.324624 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerDied","Data":"fd8002808c5d3f13b3b01cadcdced7f1edb530c711896d763103800ccc5d24e3"} Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.332367 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ntck8" 
podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="registry-server" containerID="cri-o://d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d" gracePeriod=2 Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.563358 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_d3c490af-d8bd-4659-b51d-6aec80c439c8/docker-build/0.log" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.564035 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.693575 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695043 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695084 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695148 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695177 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695225 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzx9v\" (UniqueName: \"kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695251 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695286 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695337 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695429 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695636 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695680 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695700 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695703 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695727 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache\") pod \"d3c490af-d8bd-4659-b51d-6aec80c439c8\" (UID: \"d3c490af-d8bd-4659-b51d-6aec80c439c8\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.695756 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities\") pod \"e89ade9c-892d-466e-bfaa-f45237078d28\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.696045 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.696167 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.696200 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.696215 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3c490af-d8bd-4659-b51d-6aec80c439c8-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.697064 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities" (OuterVolumeSpecName: "utilities") pod "e89ade9c-892d-466e-bfaa-f45237078d28" (UID: "e89ade9c-892d-466e-bfaa-f45237078d28"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.697800 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.698066 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.698438 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.698742 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.698971 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.699024 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.701805 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.701873 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v" (OuterVolumeSpecName: "kube-api-access-gzx9v") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "kube-api-access-gzx9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.702172 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "d3c490af-d8bd-4659-b51d-6aec80c439c8" (UID: "d3c490af-d8bd-4659-b51d-6aec80c439c8"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.796783 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content\") pod \"e89ade9c-892d-466e-bfaa-f45237078d28\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.796863 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6mb5\" (UniqueName: \"kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5\") pod \"e89ade9c-892d-466e-bfaa-f45237078d28\" (UID: \"e89ade9c-892d-466e-bfaa-f45237078d28\") " Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797117 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797130 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797140 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzx9v\" (UniqueName: \"kubernetes.io/projected/d3c490af-d8bd-4659-b51d-6aec80c439c8-kube-api-access-gzx9v\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797149 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797161 4713 
reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797170 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797373 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/d3c490af-d8bd-4659-b51d-6aec80c439c8-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797381 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797389 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3c490af-d8bd-4659-b51d-6aec80c439c8-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.797397 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.800680 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5" (OuterVolumeSpecName: "kube-api-access-j6mb5") pod "e89ade9c-892d-466e-bfaa-f45237078d28" (UID: "e89ade9c-892d-466e-bfaa-f45237078d28"). InnerVolumeSpecName "kube-api-access-j6mb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.852571 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e89ade9c-892d-466e-bfaa-f45237078d28" (UID: "e89ade9c-892d-466e-bfaa-f45237078d28"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.898726 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89ade9c-892d-466e-bfaa-f45237078d28-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:00 crc kubenswrapper[4713]: I0308 00:31:00.898768 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6mb5\" (UniqueName: \"kubernetes.io/projected/e89ade9c-892d-466e-bfaa-f45237078d28-kube-api-access-j6mb5\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.341079 4713 generic.go:334] "Generic (PLEG): container finished" podID="e89ade9c-892d-466e-bfaa-f45237078d28" containerID="d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d" exitCode=0 Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.341182 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntck8" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.341682 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerDied","Data":"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d"} Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.341731 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntck8" event={"ID":"e89ade9c-892d-466e-bfaa-f45237078d28","Type":"ContainerDied","Data":"cb867c50000d8b99afc6f684e5f719dee646322428c1a08e8dabaed01ce9d7d2"} Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.341750 4713 scope.go:117] "RemoveContainer" containerID="d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.343200 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_d3c490af-d8bd-4659-b51d-6aec80c439c8/docker-build/0.log" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.343731 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"d3c490af-d8bd-4659-b51d-6aec80c439c8","Type":"ContainerDied","Data":"faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900"} Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.343753 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faabe974ce3bd14531f953a59ae1d347ed0c1ab0c8fbb5df3c5801d95e48c900" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.343808 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.369521 4713 scope.go:117] "RemoveContainer" containerID="27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.417080 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntck8"] Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.422173 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ntck8"] Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.427607 4713 scope.go:117] "RemoveContainer" containerID="5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.444000 4713 scope.go:117] "RemoveContainer" containerID="d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d" Mar 08 00:31:01 crc kubenswrapper[4713]: E0308 00:31:01.444514 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d\": container with ID starting with d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d not found: ID does not exist" containerID="d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.444548 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d"} err="failed to get container status \"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d\": rpc error: code = NotFound desc = could not find container \"d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d\": container with ID starting with 
d282775e1e74077693a918e8838236835dd07b5d6bc464670e0fbcb348261c3d not found: ID does not exist" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.444569 4713 scope.go:117] "RemoveContainer" containerID="27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff" Mar 08 00:31:01 crc kubenswrapper[4713]: E0308 00:31:01.445140 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff\": container with ID starting with 27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff not found: ID does not exist" containerID="27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.445195 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff"} err="failed to get container status \"27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff\": rpc error: code = NotFound desc = could not find container \"27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff\": container with ID starting with 27b86b2af86fd38b42111200b9e048dd6a75f0ef35854300e5908e9b58b8acff not found: ID does not exist" Mar 08 00:31:01 crc kubenswrapper[4713]: I0308 00:31:01.445230 4713 scope.go:117] "RemoveContainer" containerID="5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4" Mar 08 00:31:01 crc kubenswrapper[4713]: E0308 00:31:01.445559 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4\": container with ID starting with 5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4 not found: ID does not exist" containerID="5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4" Mar 08 00:31:01 crc 
kubenswrapper[4713]: I0308 00:31:01.445589 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4"} err="failed to get container status \"5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4\": rpc error: code = NotFound desc = could not find container \"5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4\": container with ID starting with 5f03d3220ec8d81401dd84eb9a971c62c0f5e5428b77b5f03d27706e5b80e4a4 not found: ID does not exist" Mar 08 00:31:02 crc kubenswrapper[4713]: I0308 00:31:02.550609 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" path="/var/lib/kubelet/pods/e89ade9c-892d-466e-bfaa-f45237078d28/volumes" Mar 08 00:31:06 crc kubenswrapper[4713]: I0308 00:31:06.056209 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 08 00:31:06 crc kubenswrapper[4713]: I0308 00:31:06.063085 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 08 00:31:06 crc kubenswrapper[4713]: I0308 00:31:06.555367 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" path="/var/lib/kubelet/pods/d3c490af-d8bd-4659-b51d-6aec80c439c8/volumes" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.633935 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 08 00:31:07 crc kubenswrapper[4713]: E0308 00:31:07.634342 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="extract-utilities" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634360 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" 
containerName="extract-utilities" Mar 08 00:31:07 crc kubenswrapper[4713]: E0308 00:31:07.634376 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="registry-server" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634382 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="registry-server" Mar 08 00:31:07 crc kubenswrapper[4713]: E0308 00:31:07.634396 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="extract-content" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634402 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="extract-content" Mar 08 00:31:07 crc kubenswrapper[4713]: E0308 00:31:07.634412 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerName="manage-dockerfile" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634418 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerName="manage-dockerfile" Mar 08 00:31:07 crc kubenswrapper[4713]: E0308 00:31:07.634428 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerName="docker-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634433 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" containerName="docker-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634534 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89ade9c-892d-466e-bfaa-f45237078d28" containerName="registry-server" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.634548 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c490af-d8bd-4659-b51d-6aec80c439c8" 
containerName="docker-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.637219 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.643423 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.643598 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.643684 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.643882 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.645482 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804005 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804331 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804357 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804379 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804508 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804546 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804580 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gx2j\" (UniqueName: \"kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804603 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804624 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804641 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804681 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.804707 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906191 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906338 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gx2j\" (UniqueName: \"kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906484 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906544 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906722 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906773 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906796 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906818 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906876 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906884 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc 
kubenswrapper[4713]: I0308 00:31:07.906910 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.906951 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.907522 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.907618 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.907671 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.908027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.908185 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.908446 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.908451 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.913101 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: 
\"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.917542 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.923481 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gx2j\" (UniqueName: \"kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:07 crc kubenswrapper[4713]: I0308 00:31:07.954063 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:08 crc kubenswrapper[4713]: I0308 00:31:08.151258 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 08 00:31:08 crc kubenswrapper[4713]: I0308 00:31:08.392739 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerStarted","Data":"ed2ca7deb7d3bd4de4031839877bb86e8604fb184312c12ef9bb0197c61c0b3c"} Mar 08 00:31:09 crc kubenswrapper[4713]: I0308 00:31:09.399934 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerStarted","Data":"47decae92d17c21a2177c133a5d16c644933bb532b7b55fff3cd35090a2adb3c"} Mar 08 00:31:10 crc kubenswrapper[4713]: I0308 00:31:10.407931 4713 generic.go:334] "Generic (PLEG): container finished" podID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerID="47decae92d17c21a2177c133a5d16c644933bb532b7b55fff3cd35090a2adb3c" exitCode=0 Mar 08 00:31:10 crc kubenswrapper[4713]: I0308 00:31:10.408037 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerDied","Data":"47decae92d17c21a2177c133a5d16c644933bb532b7b55fff3cd35090a2adb3c"} Mar 08 00:31:11 crc kubenswrapper[4713]: I0308 00:31:11.415747 4713 generic.go:334] "Generic (PLEG): container finished" podID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerID="f5d5ac707738aa5e892d7bcbb3f9759e8231d82d6983c233b86849777da5ccaa" exitCode=0 Mar 08 00:31:11 crc kubenswrapper[4713]: I0308 00:31:11.416075 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" 
event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerDied","Data":"f5d5ac707738aa5e892d7bcbb3f9759e8231d82d6983c233b86849777da5ccaa"} Mar 08 00:31:11 crc kubenswrapper[4713]: I0308 00:31:11.460216 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_84b3ed06-5d45-4c0f-a4b4-bec838490219/manage-dockerfile/0.log" Mar 08 00:31:12 crc kubenswrapper[4713]: I0308 00:31:12.423612 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerStarted","Data":"2414699b5905b9face59685a2ba3cebf47868ce7ce3bf47b1261c2b3eb36d1ee"} Mar 08 00:31:12 crc kubenswrapper[4713]: I0308 00:31:12.457673 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=5.457651641 podStartE2EDuration="5.457651641s" podCreationTimestamp="2026-03-08 00:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:31:12.451374754 +0000 UTC m=+1526.571007007" watchObservedRunningTime="2026-03-08 00:31:12.457651641 +0000 UTC m=+1526.577283874" Mar 08 00:31:14 crc kubenswrapper[4713]: I0308 00:31:14.440486 4713 generic.go:334] "Generic (PLEG): container finished" podID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerID="2414699b5905b9face59685a2ba3cebf47868ce7ce3bf47b1261c2b3eb36d1ee" exitCode=0 Mar 08 00:31:14 crc kubenswrapper[4713]: I0308 00:31:14.440543 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerDied","Data":"2414699b5905b9face59685a2ba3cebf47868ce7ce3bf47b1261c2b3eb36d1ee"} Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.740985 4713 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.916718 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.917916 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gx2j\" (UniqueName: \"kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.917945 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.917979 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.918008 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc 
kubenswrapper[4713]: I0308 00:31:15.918058 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.918083 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.917866 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919048 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919091 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919120 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919144 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919132 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919186 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root\") pod \"84b3ed06-5d45-4c0f-a4b4-bec838490219\" (UID: \"84b3ed06-5d45-4c0f-a4b4-bec838490219\") " Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919241 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919652 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919725 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919911 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919929 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919943 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919954 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.919964 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.920016 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.920092 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.921623 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.923386 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j" (OuterVolumeSpecName: "kube-api-access-4gx2j") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "kube-api-access-4gx2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.923601 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.923636 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:15 crc kubenswrapper[4713]: I0308 00:31:15.924747 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "84b3ed06-5d45-4c0f-a4b4-bec838490219" (UID: "84b3ed06-5d45-4c0f-a4b4-bec838490219"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021062 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021122 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021170 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021199 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021227 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/84b3ed06-5d45-4c0f-a4b4-bec838490219-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021246 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gx2j\" (UniqueName: \"kubernetes.io/projected/84b3ed06-5d45-4c0f-a4b4-bec838490219-kube-api-access-4gx2j\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.021263 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/84b3ed06-5d45-4c0f-a4b4-bec838490219-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.455391 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"84b3ed06-5d45-4c0f-a4b4-bec838490219","Type":"ContainerDied","Data":"ed2ca7deb7d3bd4de4031839877bb86e8604fb184312c12ef9bb0197c61c0b3c"} Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.455440 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed2ca7deb7d3bd4de4031839877bb86e8604fb184312c12ef9bb0197c61c0b3c" Mar 08 00:31:16 crc kubenswrapper[4713]: I0308 00:31:16.455450 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.495035 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 08 00:31:19 crc kubenswrapper[4713]: E0308 00:31:19.495653 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="manage-dockerfile" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.495671 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="manage-dockerfile" Mar 08 00:31:19 crc kubenswrapper[4713]: E0308 00:31:19.495681 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="git-clone" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.495689 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="git-clone" Mar 08 00:31:19 crc kubenswrapper[4713]: E0308 00:31:19.495711 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="docker-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.495719 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="docker-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.495885 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b3ed06-5d45-4c0f-a4b4-bec838490219" containerName="docker-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.496570 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.499761 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.499919 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.499951 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.500292 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.513501 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666654 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666697 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666715 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666733 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666779 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666845 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666887 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666930 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666954 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.666978 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-482h5\" (UniqueName: \"kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.667001 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 
00:31:19.667030 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.768551 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.768983 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769136 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769263 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769366 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769193 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769560 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-482h5\" (UniqueName: \"kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769666 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769780 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.769958 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770086 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770020 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770246 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: 
I0308 00:31:19.770195 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770517 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770340 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770484 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.770791 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.771295 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.772608 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.775157 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.775170 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: 
\"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.789508 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-482h5\" (UniqueName: \"kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:19 crc kubenswrapper[4713]: I0308 00:31:19.813460 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:20 crc kubenswrapper[4713]: I0308 00:31:20.011322 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 08 00:31:20 crc kubenswrapper[4713]: I0308 00:31:20.481887 4713 generic.go:334] "Generic (PLEG): container finished" podID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerID="c082051221894646965936ec6155e8aca998188d9e68b92365d5716b581ebfa0" exitCode=0 Mar 08 00:31:20 crc kubenswrapper[4713]: I0308 00:31:20.481944 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"41a78c77-3173-4e12-b68e-a9421ccb4298","Type":"ContainerDied","Data":"c082051221894646965936ec6155e8aca998188d9e68b92365d5716b581ebfa0"} Mar 08 00:31:20 crc kubenswrapper[4713]: I0308 00:31:20.482168 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"41a78c77-3173-4e12-b68e-a9421ccb4298","Type":"ContainerStarted","Data":"b2aab60f46ae0fe29ee29d5026cab7a6c14b56493903f12295b6bde8dae8b9de"} Mar 08 00:31:21 crc 
kubenswrapper[4713]: I0308 00:31:21.490987 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_41a78c77-3173-4e12-b68e-a9421ccb4298/docker-build/0.log" Mar 08 00:31:21 crc kubenswrapper[4713]: I0308 00:31:21.491453 4713 generic.go:334] "Generic (PLEG): container finished" podID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerID="a21bd8ee1ac8242c094817b3835b31572654184e63adec117111a47c5246ee20" exitCode=1 Mar 08 00:31:21 crc kubenswrapper[4713]: I0308 00:31:21.491492 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"41a78c77-3173-4e12-b68e-a9421ccb4298","Type":"ContainerDied","Data":"a21bd8ee1ac8242c094817b3835b31572654184e63adec117111a47c5246ee20"} Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.700710 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_41a78c77-3173-4e12-b68e-a9421ccb4298/docker-build/0.log" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.701275 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807650 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807712 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807814 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807874 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807907 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-482h5\" (UniqueName: \"kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807912 4713 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.807932 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808009 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808086 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808127 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808154 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808196 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808226 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles\") pod \"41a78c77-3173-4e12-b68e-a9421ccb4298\" (UID: \"41a78c77-3173-4e12-b68e-a9421ccb4298\") " Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808449 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808508 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808578 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808590 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808598 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/41a78c77-3173-4e12-b68e-a9421ccb4298-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.808947 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.809198 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.809272 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.809296 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.809495 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.809752 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.813297 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.813394 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.813660 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5" (OuterVolumeSpecName: "kube-api-access-482h5") pod "41a78c77-3173-4e12-b68e-a9421ccb4298" (UID: "41a78c77-3173-4e12-b68e-a9421ccb4298"). InnerVolumeSpecName "kube-api-access-482h5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909284 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909324 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909337 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-482h5\" (UniqueName: \"kubernetes.io/projected/41a78c77-3173-4e12-b68e-a9421ccb4298-kube-api-access-482h5\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909345 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909354 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909363 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/41a78c77-3173-4e12-b68e-a9421ccb4298-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909373 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/41a78c77-3173-4e12-b68e-a9421ccb4298-builder-dockercfg-ptp88-push\") on 
node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909383 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:22 crc kubenswrapper[4713]: I0308 00:31:22.909395 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/41a78c77-3173-4e12-b68e-a9421ccb4298-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:23 crc kubenswrapper[4713]: I0308 00:31:23.504328 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_41a78c77-3173-4e12-b68e-a9421ccb4298/docker-build/0.log" Mar 08 00:31:23 crc kubenswrapper[4713]: I0308 00:31:23.505091 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"41a78c77-3173-4e12-b68e-a9421ccb4298","Type":"ContainerDied","Data":"b2aab60f46ae0fe29ee29d5026cab7a6c14b56493903f12295b6bde8dae8b9de"} Mar 08 00:31:23 crc kubenswrapper[4713]: I0308 00:31:23.505131 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2aab60f46ae0fe29ee29d5026cab7a6c14b56493903f12295b6bde8dae8b9de" Mar 08 00:31:23 crc kubenswrapper[4713]: I0308 00:31:23.505167 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 08 00:31:30 crc kubenswrapper[4713]: I0308 00:31:30.000268 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 08 00:31:30 crc kubenswrapper[4713]: I0308 00:31:30.005789 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 08 00:31:30 crc kubenswrapper[4713]: I0308 00:31:30.549349 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" path="/var/lib/kubelet/pods/41a78c77-3173-4e12-b68e-a9421ccb4298/volumes" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.615544 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 08 00:31:31 crc kubenswrapper[4713]: E0308 00:31:31.616144 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerName="manage-dockerfile" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.616161 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerName="manage-dockerfile" Mar 08 00:31:31 crc kubenswrapper[4713]: E0308 00:31:31.616186 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerName="docker-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.616194 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerName="docker-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.616321 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a78c77-3173-4e12-b68e-a9421ccb4298" containerName="docker-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.617266 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.619349 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.619539 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.619562 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.619601 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.643103 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.724727 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.724956 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725151 
4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725192 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725350 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725435 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725451 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725476 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqwtf\" (UniqueName: \"kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725493 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725509 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.725588 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827187 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827234 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827258 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827274 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827295 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqwtf\" (UniqueName: \"kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827311 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827326 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827341 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827351 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827361 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827438 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827492 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.827517 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: 
I0308 00:31:31.827892 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828381 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828478 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828554 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828644 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " 
pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828790 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.828863 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.829153 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.842454 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.842783 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: 
\"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.845616 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqwtf\" (UniqueName: \"kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:31 crc kubenswrapper[4713]: I0308 00:31:31.930810 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:32 crc kubenswrapper[4713]: I0308 00:31:32.168441 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 08 00:31:32 crc kubenswrapper[4713]: I0308 00:31:32.561341 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerStarted","Data":"57a8fe67e5a0289227ece8804d0a9f763244b1e53546172e25e72bce650ca856"} Mar 08 00:31:32 crc kubenswrapper[4713]: I0308 00:31:32.561384 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerStarted","Data":"df639c98458e3df6cf7ef96ddee63017cdbf9ad7b00bd6af4b3f8230fbca306d"} Mar 08 00:31:34 crc kubenswrapper[4713]: I0308 00:31:34.574538 4713 generic.go:334] "Generic (PLEG): container finished" podID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerID="57a8fe67e5a0289227ece8804d0a9f763244b1e53546172e25e72bce650ca856" exitCode=0 Mar 08 00:31:34 crc 
kubenswrapper[4713]: I0308 00:31:34.574839 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerDied","Data":"57a8fe67e5a0289227ece8804d0a9f763244b1e53546172e25e72bce650ca856"} Mar 08 00:31:35 crc kubenswrapper[4713]: I0308 00:31:35.583361 4713 generic.go:334] "Generic (PLEG): container finished" podID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerID="9b66fc43b9ecb8837083ebc5b05d6a1cc956eabe67d66e3c5d86e4e7327451a5" exitCode=0 Mar 08 00:31:35 crc kubenswrapper[4713]: I0308 00:31:35.583465 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerDied","Data":"9b66fc43b9ecb8837083ebc5b05d6a1cc956eabe67d66e3c5d86e4e7327451a5"} Mar 08 00:31:35 crc kubenswrapper[4713]: I0308 00:31:35.625101 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_eb0ec6e5-4cc4-4c52-a320-a163af42eca6/manage-dockerfile/0.log" Mar 08 00:31:36 crc kubenswrapper[4713]: I0308 00:31:36.597740 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerStarted","Data":"bb7b06753d12f2b66f7473e7fccd80076bc01abcef253cc3b1ff7bcaddce480c"} Mar 08 00:31:36 crc kubenswrapper[4713]: I0308 00:31:36.636770 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=5.636748292 podStartE2EDuration="5.636748292s" podCreationTimestamp="2026-03-08 00:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:31:36.62836931 +0000 UTC m=+1550.748001563" 
watchObservedRunningTime="2026-03-08 00:31:36.636748292 +0000 UTC m=+1550.756380525" Mar 08 00:31:39 crc kubenswrapper[4713]: I0308 00:31:39.625446 4713 generic.go:334] "Generic (PLEG): container finished" podID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerID="bb7b06753d12f2b66f7473e7fccd80076bc01abcef253cc3b1ff7bcaddce480c" exitCode=0 Mar 08 00:31:39 crc kubenswrapper[4713]: I0308 00:31:39.625490 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerDied","Data":"bb7b06753d12f2b66f7473e7fccd80076bc01abcef253cc3b1ff7bcaddce480c"} Mar 08 00:31:40 crc kubenswrapper[4713]: I0308 00:31:40.855753 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053411 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053484 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053553 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: 
\"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053603 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053644 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053674 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053702 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053730 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqwtf\" (UniqueName: \"kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.053794 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054090 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054122 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054170 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets\") pod \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\" (UID: \"eb0ec6e5-4cc4-4c52-a320-a163af42eca6\") " Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054296 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054466 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054571 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054648 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054668 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054679 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.054713 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles" (OuterVolumeSpecName: 
"build-ca-bundles") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.055080 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.055165 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.056280 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.056379 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.058183 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.059584 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.059920 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.060124 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf" (OuterVolumeSpecName: "kube-api-access-gqwtf") pod "eb0ec6e5-4cc4-4c52-a320-a163af42eca6" (UID: "eb0ec6e5-4cc4-4c52-a320-a163af42eca6"). InnerVolumeSpecName "kube-api-access-gqwtf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155188 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155221 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155233 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155245 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155256 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155264 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155273 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqwtf\" (UniqueName: 
\"kubernetes.io/projected/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-kube-api-access-gqwtf\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155282 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.155290 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eb0ec6e5-4cc4-4c52-a320-a163af42eca6-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.643538 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"eb0ec6e5-4cc4-4c52-a320-a163af42eca6","Type":"ContainerDied","Data":"df639c98458e3df6cf7ef96ddee63017cdbf9ad7b00bd6af4b3f8230fbca306d"} Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.643611 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df639c98458e3df6cf7ef96ddee63017cdbf9ad7b00bd6af4b3f8230fbca306d" Mar 08 00:31:41 crc kubenswrapper[4713]: I0308 00:31:41.643609 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.834968 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 08 00:31:56 crc kubenswrapper[4713]: E0308 00:31:56.835667 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="docker-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.835679 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="docker-build" Mar 08 00:31:56 crc kubenswrapper[4713]: E0308 00:31:56.835690 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="manage-dockerfile" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.835697 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="manage-dockerfile" Mar 08 00:31:56 crc kubenswrapper[4713]: E0308 00:31:56.835708 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="git-clone" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.835713 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="git-clone" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.835804 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0ec6e5-4cc4-4c52-a320-a163af42eca6" containerName="docker-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.836610 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.839474 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.839638 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.839869 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-ptp88" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.840723 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.840913 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856156 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856835 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856865 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: 
\"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856890 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856911 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856928 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856943 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856962 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.856996 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.857023 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.857053 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.857077 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.857104 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.857124 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpmmr\" (UniqueName: \"kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960265 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960353 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir\") pod 
\"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960428 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960462 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960490 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpmmr\" (UniqueName: \"kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960513 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960537 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960562 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960584 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960608 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960645 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.960688 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.961262 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.961749 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 
00:31:56.962229 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.962456 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.962478 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.962549 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.962552 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.962877 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.968579 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.969852 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.971985 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.975254 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:56 crc kubenswrapper[4713]: I0308 00:31:56.981471 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpmmr\" (UniqueName: \"kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr\") pod \"service-telemetry-framework-index-1-build\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:57 crc kubenswrapper[4713]: I0308 00:31:57.154095 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:31:57 crc kubenswrapper[4713]: I0308 00:31:57.378114 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 08 00:31:57 crc kubenswrapper[4713]: I0308 00:31:57.754390 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerStarted","Data":"0904ca818df9cb0a3b1a7f6e4f990dfc98b1a5af9c3112ab7a99391125d44f3e"} Mar 08 00:31:57 crc kubenswrapper[4713]: I0308 00:31:57.754450 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerStarted","Data":"68961efedf2da8822c35e5e96f5b92fab19325b9a9c28b4dcb20edbc175d01cb"} Mar 08 00:31:58 crc kubenswrapper[4713]: I0308 00:31:58.762566 4713 generic.go:334] "Generic (PLEG): container finished" podID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" 
containerID="0904ca818df9cb0a3b1a7f6e4f990dfc98b1a5af9c3112ab7a99391125d44f3e" exitCode=0 Mar 08 00:31:58 crc kubenswrapper[4713]: I0308 00:31:58.762614 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerDied","Data":"0904ca818df9cb0a3b1a7f6e4f990dfc98b1a5af9c3112ab7a99391125d44f3e"} Mar 08 00:31:59 crc kubenswrapper[4713]: I0308 00:31:59.771067 4713 generic.go:334] "Generic (PLEG): container finished" podID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerID="5fe140ef81009f2c519e74f40fd71a40332e5dc01b26fb1ae49ef3ff0efa8c16" exitCode=0 Mar 08 00:31:59 crc kubenswrapper[4713]: I0308 00:31:59.771118 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerDied","Data":"5fe140ef81009f2c519e74f40fd71a40332e5dc01b26fb1ae49ef3ff0efa8c16"} Mar 08 00:31:59 crc kubenswrapper[4713]: I0308 00:31:59.804607 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_a1f08b8e-b7bf-4e1a-934f-b3dd95201eab/manage-dockerfile/0.log" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.132803 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548832-6k4lz"] Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.133943 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.136635 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.136758 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.136846 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.141294 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548832-6k4lz"] Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.307339 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwvbr\" (UniqueName: \"kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr\") pod \"auto-csr-approver-29548832-6k4lz\" (UID: \"d0a13b2b-064d-4323-8d5c-d86f76405f38\") " pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.408679 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwvbr\" (UniqueName: \"kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr\") pod \"auto-csr-approver-29548832-6k4lz\" (UID: \"d0a13b2b-064d-4323-8d5c-d86f76405f38\") " pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.428680 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwvbr\" (UniqueName: \"kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr\") pod \"auto-csr-approver-29548832-6k4lz\" (UID: \"d0a13b2b-064d-4323-8d5c-d86f76405f38\") " 
pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.451409 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.641460 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548832-6k4lz"] Mar 08 00:32:00 crc kubenswrapper[4713]: W0308 00:32:00.651005 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0a13b2b_064d_4323_8d5c_d86f76405f38.slice/crio-3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea WatchSource:0}: Error finding container 3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea: Status 404 returned error can't find the container with id 3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.779874 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerStarted","Data":"0ac54551228085177532c06723bf629c1b218e85b223f3abe113f31369692f4c"} Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.782087 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" event={"ID":"d0a13b2b-064d-4323-8d5c-d86f76405f38","Type":"ContainerStarted","Data":"3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea"} Mar 08 00:32:00 crc kubenswrapper[4713]: I0308 00:32:00.821641 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=4.821620852 podStartE2EDuration="4.821620852s" podCreationTimestamp="2026-03-08 00:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:32:00.813535407 +0000 UTC m=+1574.933167650" watchObservedRunningTime="2026-03-08 00:32:00.821620852 +0000 UTC m=+1574.941253085" Mar 08 00:32:02 crc kubenswrapper[4713]: I0308 00:32:02.796351 4713 generic.go:334] "Generic (PLEG): container finished" podID="d0a13b2b-064d-4323-8d5c-d86f76405f38" containerID="d06ee3cd17ca3058dd1d41ca8e61fbdf1a5ff7196264bb612799359dc20d5255" exitCode=0 Mar 08 00:32:02 crc kubenswrapper[4713]: I0308 00:32:02.796410 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" event={"ID":"d0a13b2b-064d-4323-8d5c-d86f76405f38","Type":"ContainerDied","Data":"d06ee3cd17ca3058dd1d41ca8e61fbdf1a5ff7196264bb612799359dc20d5255"} Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.020178 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.154286 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwvbr\" (UniqueName: \"kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr\") pod \"d0a13b2b-064d-4323-8d5c-d86f76405f38\" (UID: \"d0a13b2b-064d-4323-8d5c-d86f76405f38\") " Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.159954 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr" (OuterVolumeSpecName: "kube-api-access-jwvbr") pod "d0a13b2b-064d-4323-8d5c-d86f76405f38" (UID: "d0a13b2b-064d-4323-8d5c-d86f76405f38"). InnerVolumeSpecName "kube-api-access-jwvbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.255216 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwvbr\" (UniqueName: \"kubernetes.io/projected/d0a13b2b-064d-4323-8d5c-d86f76405f38-kube-api-access-jwvbr\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.810550 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" event={"ID":"d0a13b2b-064d-4323-8d5c-d86f76405f38","Type":"ContainerDied","Data":"3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea"} Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.810898 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3405b0ac1a9177e914b5c6c6c23949359b1a7e5cfff2352a3ad2e15156c6a7ea" Mar 08 00:32:04 crc kubenswrapper[4713]: I0308 00:32:04.810615 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548832-6k4lz" Mar 08 00:32:05 crc kubenswrapper[4713]: I0308 00:32:05.076528 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548826-fhk5r"] Mar 08 00:32:05 crc kubenswrapper[4713]: I0308 00:32:05.095718 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548826-fhk5r"] Mar 08 00:32:06 crc kubenswrapper[4713]: I0308 00:32:06.548396 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45fc1987-0bdc-476c-9315-18ddbf570461" path="/var/lib/kubelet/pods/45fc1987-0bdc-476c-9315-18ddbf570461/volumes" Mar 08 00:32:15 crc kubenswrapper[4713]: I0308 00:32:15.294019 4713 scope.go:117] "RemoveContainer" containerID="76cb1ca43446adb6dc230f530d8737aea0a1011651185fc5861e17e4b5ae2a6c" Mar 08 00:32:28 crc kubenswrapper[4713]: I0308 00:32:28.956034 4713 generic.go:334] "Generic (PLEG): container finished" 
podID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerID="0ac54551228085177532c06723bf629c1b218e85b223f3abe113f31369692f4c" exitCode=0 Mar 08 00:32:28 crc kubenswrapper[4713]: I0308 00:32:28.956123 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerDied","Data":"0ac54551228085177532c06723bf629c1b218e85b223f3abe113f31369692f4c"} Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.188785 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387287 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387365 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387409 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387457 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387508 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpmmr\" (UniqueName: \"kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387446 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387551 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387652 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387716 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387749 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387789 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387874 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387911 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache\") pod \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.387955 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run\") pod 
\"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\" (UID: \"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab\") " Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.388483 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.389054 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.389178 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.389373 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.390216 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.390653 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.390789 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.393849 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push" (OuterVolumeSpecName: "builder-dockercfg-ptp88-push") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "builder-dockercfg-ptp88-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.394171 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.394729 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull" (OuterVolumeSpecName: "builder-dockercfg-ptp88-pull") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "builder-dockercfg-ptp88-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.395781 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr" (OuterVolumeSpecName: "kube-api-access-lpmmr") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "kube-api-access-lpmmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489136 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489353 4713 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489442 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489531 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489623 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-push\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-push\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489697 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489772 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489864 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpmmr\" (UniqueName: \"kubernetes.io/projected/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-kube-api-access-lpmmr\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.489950 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.490021 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-ptp88-pull\" (UniqueName: \"kubernetes.io/secret/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-builder-dockercfg-ptp88-pull\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.625321 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.698186 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.969160 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"a1f08b8e-b7bf-4e1a-934f-b3dd95201eab","Type":"ContainerDied","Data":"68961efedf2da8822c35e5e96f5b92fab19325b9a9c28b4dcb20edbc175d01cb"} Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.969195 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68961efedf2da8822c35e5e96f5b92fab19325b9a9c28b4dcb20edbc175d01cb" Mar 08 00:32:30 crc kubenswrapper[4713]: I0308 00:32:30.969550 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 08 00:32:31 crc kubenswrapper[4713]: I0308 00:32:31.389807 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" (UID: "a1f08b8e-b7bf-4e1a-934f-b3dd95201eab"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:32:31 crc kubenswrapper[4713]: I0308 00:32:31.405054 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a1f08b8e-b7bf-4e1a-934f-b3dd95201eab-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.170704 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:32 crc kubenswrapper[4713]: E0308 00:32:32.170992 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="manage-dockerfile" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171008 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="manage-dockerfile" Mar 08 00:32:32 crc kubenswrapper[4713]: E0308 00:32:32.171021 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="docker-build" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171028 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="docker-build" Mar 08 00:32:32 crc kubenswrapper[4713]: E0308 00:32:32.171040 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="git-clone" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171046 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="git-clone" Mar 08 00:32:32 crc kubenswrapper[4713]: E0308 00:32:32.171057 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a13b2b-064d-4323-8d5c-d86f76405f38" containerName="oc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171063 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d0a13b2b-064d-4323-8d5c-d86f76405f38" containerName="oc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171163 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1f08b8e-b7bf-4e1a-934f-b3dd95201eab" containerName="docker-build" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171172 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a13b2b-064d-4323-8d5c-d86f76405f38" containerName="oc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.171574 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.176211 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-dkxrf" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.183282 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.215950 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bsvm\" (UniqueName: \"kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm\") pod \"infrawatch-operators-qz9xc\" (UID: \"aab425e0-6643-4517-893f-6a638b8ae66d\") " pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.317472 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bsvm\" (UniqueName: \"kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm\") pod \"infrawatch-operators-qz9xc\" (UID: \"aab425e0-6643-4517-893f-6a638b8ae66d\") " pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.349806 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bsvm\" 
(UniqueName: \"kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm\") pod \"infrawatch-operators-qz9xc\" (UID: \"aab425e0-6643-4517-893f-6a638b8ae66d\") " pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:32 crc kubenswrapper[4713]: I0308 00:32:32.488976 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:33 crc kubenswrapper[4713]: I0308 00:32:33.868794 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:33 crc kubenswrapper[4713]: I0308 00:32:33.993977 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-qz9xc" event={"ID":"aab425e0-6643-4517-893f-6a638b8ae66d","Type":"ContainerStarted","Data":"0f0b7586d1a19f7104b1b86f7584f513a6655a21578c4d11fbfb55a6aaa1dd71"} Mar 08 00:32:34 crc kubenswrapper[4713]: I0308 00:32:34.500953 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:32:34 crc kubenswrapper[4713]: I0308 00:32:34.501337 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:32:36 crc kubenswrapper[4713]: I0308 00:32:36.974644 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:37 crc kubenswrapper[4713]: I0308 00:32:37.774092 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/infrawatch-operators-rx6bq"] Mar 08 00:32:37 crc kubenswrapper[4713]: I0308 00:32:37.775169 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:37 crc kubenswrapper[4713]: I0308 00:32:37.789072 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-rx6bq"] Mar 08 00:32:37 crc kubenswrapper[4713]: I0308 00:32:37.890722 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpzp4\" (UniqueName: \"kubernetes.io/projected/40f96514-d597-436a-8158-0535f61fa6f8-kube-api-access-bpzp4\") pod \"infrawatch-operators-rx6bq\" (UID: \"40f96514-d597-436a-8158-0535f61fa6f8\") " pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:37 crc kubenswrapper[4713]: I0308 00:32:37.992339 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpzp4\" (UniqueName: \"kubernetes.io/projected/40f96514-d597-436a-8158-0535f61fa6f8-kube-api-access-bpzp4\") pod \"infrawatch-operators-rx6bq\" (UID: \"40f96514-d597-436a-8158-0535f61fa6f8\") " pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:38 crc kubenswrapper[4713]: I0308 00:32:38.022556 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpzp4\" (UniqueName: \"kubernetes.io/projected/40f96514-d597-436a-8158-0535f61fa6f8-kube-api-access-bpzp4\") pod \"infrawatch-operators-rx6bq\" (UID: \"40f96514-d597-436a-8158-0535f61fa6f8\") " pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:38 crc kubenswrapper[4713]: I0308 00:32:38.102253 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:44 crc kubenswrapper[4713]: I0308 00:32:44.691184 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-rx6bq"] Mar 08 00:32:46 crc kubenswrapper[4713]: W0308 00:32:46.797298 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40f96514_d597_436a_8158_0535f61fa6f8.slice/crio-fd21ae6241edae87432b1ba3ed248ed5ef3d6e8aaeb3aa7a25cc35ba2fa77dbb WatchSource:0}: Error finding container fd21ae6241edae87432b1ba3ed248ed5ef3d6e8aaeb3aa7a25cc35ba2fa77dbb: Status 404 returned error can't find the container with id fd21ae6241edae87432b1ba3ed248ed5ef3d6e8aaeb3aa7a25cc35ba2fa77dbb Mar 08 00:32:46 crc kubenswrapper[4713]: E0308 00:32:46.862293 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Mar 08 00:32:46 crc kubenswrapper[4713]: E0308 00:32:46.862461 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9bsvm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-qz9xc_service-telemetry(aab425e0-6643-4517-893f-6a638b8ae66d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 00:32:46 crc kubenswrapper[4713]: E0308 00:32:46.863632 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/infrawatch-operators-qz9xc" podUID="aab425e0-6643-4517-893f-6a638b8ae66d" Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.090739 4713 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="service-telemetry/infrawatch-operators-rx6bq" event={"ID":"40f96514-d597-436a-8158-0535f61fa6f8","Type":"ContainerStarted","Data":"acd23d11a52b08e960c2e2ebe308e196e7d9017fcce21b077ff99189646db2cd"} Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.090889 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-rx6bq" event={"ID":"40f96514-d597-436a-8158-0535f61fa6f8","Type":"ContainerStarted","Data":"fd21ae6241edae87432b1ba3ed248ed5ef3d6e8aaeb3aa7a25cc35ba2fa77dbb"} Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.328300 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.345982 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-rx6bq" podStartSLOduration=10.234165673 podStartE2EDuration="10.345960743s" podCreationTimestamp="2026-03-08 00:32:37 +0000 UTC" firstStartedPulling="2026-03-08 00:32:46.80031753 +0000 UTC m=+1620.919949763" lastFinishedPulling="2026-03-08 00:32:46.9121126 +0000 UTC m=+1621.031744833" observedRunningTime="2026-03-08 00:32:47.125195629 +0000 UTC m=+1621.244827862" watchObservedRunningTime="2026-03-08 00:32:47.345960743 +0000 UTC m=+1621.465592996" Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.512691 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bsvm\" (UniqueName: \"kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm\") pod \"aab425e0-6643-4517-893f-6a638b8ae66d\" (UID: \"aab425e0-6643-4517-893f-6a638b8ae66d\") " Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.517688 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm" (OuterVolumeSpecName: "kube-api-access-9bsvm") pod 
"aab425e0-6643-4517-893f-6a638b8ae66d" (UID: "aab425e0-6643-4517-893f-6a638b8ae66d"). InnerVolumeSpecName "kube-api-access-9bsvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:32:47 crc kubenswrapper[4713]: I0308 00:32:47.614435 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bsvm\" (UniqueName: \"kubernetes.io/projected/aab425e0-6643-4517-893f-6a638b8ae66d-kube-api-access-9bsvm\") on node \"crc\" DevicePath \"\"" Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.097617 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-qz9xc" event={"ID":"aab425e0-6643-4517-893f-6a638b8ae66d","Type":"ContainerDied","Data":"0f0b7586d1a19f7104b1b86f7584f513a6655a21578c4d11fbfb55a6aaa1dd71"} Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.097662 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-qz9xc" Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.102601 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.102640 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.134478 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.152710 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.159612 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-qz9xc"] Mar 08 00:32:48 crc kubenswrapper[4713]: I0308 00:32:48.550522 4713 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="aab425e0-6643-4517-893f-6a638b8ae66d" path="/var/lib/kubelet/pods/aab425e0-6643-4517-893f-6a638b8ae66d/volumes" Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.780145 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.782766 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.794465 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.920737 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.920855 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xw25\" (UniqueName: \"kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:55 crc kubenswrapper[4713]: I0308 00:32:55.921390 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.022418 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.022567 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.022633 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xw25\" (UniqueName: \"kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.022926 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.023055 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.052736 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5xw25\" (UniqueName: \"kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25\") pod \"community-operators-svj4c\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.112709 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:32:56 crc kubenswrapper[4713]: I0308 00:32:56.552398 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:32:57 crc kubenswrapper[4713]: I0308 00:32:57.160882 4713 generic.go:334] "Generic (PLEG): container finished" podID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerID="99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785" exitCode=0 Mar 08 00:32:57 crc kubenswrapper[4713]: I0308 00:32:57.162029 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerDied","Data":"99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785"} Mar 08 00:32:57 crc kubenswrapper[4713]: I0308 00:32:57.162344 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerStarted","Data":"9f82c3dff1939e1e71812e7fa1c087d46f2fb778390ac98acb85ef0038e4d1b2"} Mar 08 00:32:58 crc kubenswrapper[4713]: I0308 00:32:58.132331 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-rx6bq" Mar 08 00:32:58 crc kubenswrapper[4713]: I0308 00:32:58.174424 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" 
event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerStarted","Data":"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a"} Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.181751 4713 generic.go:334] "Generic (PLEG): container finished" podID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerID="ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a" exitCode=0 Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.181930 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerDied","Data":"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a"} Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.815441 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8"] Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.816923 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.830029 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8"] Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.874161 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.874226 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.874257 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz8k4\" (UniqueName: \"kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.975080 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle\") 
pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.975166 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.975198 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz8k4\" (UniqueName: \"kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.975968 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.976291 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " 
pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:32:59 crc kubenswrapper[4713]: I0308 00:32:59.994763 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz8k4\" (UniqueName: \"kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.142256 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.195704 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerStarted","Data":"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b"} Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.210360 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-svj4c" podStartSLOduration=2.816717234 podStartE2EDuration="5.210337641s" podCreationTimestamp="2026-03-08 00:32:55 +0000 UTC" firstStartedPulling="2026-03-08 00:32:57.166241607 +0000 UTC m=+1631.285873840" lastFinishedPulling="2026-03-08 00:32:59.559862014 +0000 UTC m=+1633.679494247" observedRunningTime="2026-03-08 00:33:00.209330134 +0000 UTC m=+1634.328962397" watchObservedRunningTime="2026-03-08 00:33:00.210337641 +0000 UTC m=+1634.329969874" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.559361 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8"] Mar 08 00:33:00 crc kubenswrapper[4713]: 
W0308 00:33:00.559432 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3468be8a_1655_46bd_869e_a1f4653984f1.slice/crio-367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b WatchSource:0}: Error finding container 367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b: Status 404 returned error can't find the container with id 367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.831252 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h"] Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.832804 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.843552 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h"] Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.885703 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.885747 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " 
pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.885770 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm5dp\" (UniqueName: \"kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.987555 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.987627 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.987658 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm5dp\" (UniqueName: \"kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc 
kubenswrapper[4713]: I0308 00:33:00.988106 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:00 crc kubenswrapper[4713]: I0308 00:33:00.988323 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.014840 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm5dp\" (UniqueName: \"kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.186286 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.213468 4713 generic.go:334] "Generic (PLEG): container finished" podID="3468be8a-1655-46bd-869e-a1f4653984f1" containerID="add7fb42d76faa64b6aeb50aea2a78e03bf22d54af5bbda2ea40f47b22633147" exitCode=0 Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.214240 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" event={"ID":"3468be8a-1655-46bd-869e-a1f4653984f1","Type":"ContainerDied","Data":"add7fb42d76faa64b6aeb50aea2a78e03bf22d54af5bbda2ea40f47b22633147"} Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.214292 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" event={"ID":"3468be8a-1655-46bd-869e-a1f4653984f1","Type":"ContainerStarted","Data":"367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b"} Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.218472 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:33:01 crc kubenswrapper[4713]: I0308 00:33:01.394720 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h"] Mar 08 00:33:01 crc kubenswrapper[4713]: W0308 00:33:01.486338 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode47eaf9e_75d1_40eb_8671_0ebc9ca47520.slice/crio-70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652 WatchSource:0}: Error finding container 70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652: Status 404 returned error can't find the container with id 70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652 Mar 08 
00:33:02 crc kubenswrapper[4713]: I0308 00:33:02.221944 4713 generic.go:334] "Generic (PLEG): container finished" podID="3468be8a-1655-46bd-869e-a1f4653984f1" containerID="a43b41dbd861e871f5b111d9733e70466dca8e98a32ae0a6001284be33a60d23" exitCode=0 Mar 08 00:33:02 crc kubenswrapper[4713]: I0308 00:33:02.222293 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" event={"ID":"3468be8a-1655-46bd-869e-a1f4653984f1","Type":"ContainerDied","Data":"a43b41dbd861e871f5b111d9733e70466dca8e98a32ae0a6001284be33a60d23"} Mar 08 00:33:02 crc kubenswrapper[4713]: I0308 00:33:02.226368 4713 generic.go:334] "Generic (PLEG): container finished" podID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerID="05d0b9d876c6f453a46a6ed447b8a8ce4f6de5109efd4222aac4fb2454f59dbc" exitCode=0 Mar 08 00:33:02 crc kubenswrapper[4713]: I0308 00:33:02.226414 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" event={"ID":"e47eaf9e-75d1-40eb-8671-0ebc9ca47520","Type":"ContainerDied","Data":"05d0b9d876c6f453a46a6ed447b8a8ce4f6de5109efd4222aac4fb2454f59dbc"} Mar 08 00:33:02 crc kubenswrapper[4713]: I0308 00:33:02.226433 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" event={"ID":"e47eaf9e-75d1-40eb-8671-0ebc9ca47520","Type":"ContainerStarted","Data":"70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652"} Mar 08 00:33:03 crc kubenswrapper[4713]: I0308 00:33:03.240276 4713 generic.go:334] "Generic (PLEG): container finished" podID="3468be8a-1655-46bd-869e-a1f4653984f1" containerID="5012414fcce13bcdc53ece6374103d9afa70ccdbaf616a7e5f092f2167af931a" exitCode=0 Mar 08 00:33:03 crc kubenswrapper[4713]: I0308 00:33:03.240345 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" event={"ID":"3468be8a-1655-46bd-869e-a1f4653984f1","Type":"ContainerDied","Data":"5012414fcce13bcdc53ece6374103d9afa70ccdbaf616a7e5f092f2167af931a"} Mar 08 00:33:03 crc kubenswrapper[4713]: I0308 00:33:03.244762 4713 generic.go:334] "Generic (PLEG): container finished" podID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerID="50c3f41e381cc6a47b78b7dc7f983678a1c1769067c43239a6cb4196bfbab5c2" exitCode=0 Mar 08 00:33:03 crc kubenswrapper[4713]: I0308 00:33:03.244804 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" event={"ID":"e47eaf9e-75d1-40eb-8671-0ebc9ca47520","Type":"ContainerDied","Data":"50c3f41e381cc6a47b78b7dc7f983678a1c1769067c43239a6cb4196bfbab5c2"} Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.252173 4713 generic.go:334] "Generic (PLEG): container finished" podID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerID="66c274bef3a35763566cb4084532025a581f2bf1d99a6c94c69c7883f0d852dd" exitCode=0 Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.252381 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" event={"ID":"e47eaf9e-75d1-40eb-8671-0ebc9ca47520","Type":"ContainerDied","Data":"66c274bef3a35763566cb4084532025a581f2bf1d99a6c94c69c7883f0d852dd"} Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.475585 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.500659 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.500725 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.539525 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle\") pod \"3468be8a-1655-46bd-869e-a1f4653984f1\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.540879 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle" (OuterVolumeSpecName: "bundle") pod "3468be8a-1655-46bd-869e-a1f4653984f1" (UID: "3468be8a-1655-46bd-869e-a1f4653984f1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.640344 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz8k4\" (UniqueName: \"kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4\") pod \"3468be8a-1655-46bd-869e-a1f4653984f1\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.640399 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util\") pod \"3468be8a-1655-46bd-869e-a1f4653984f1\" (UID: \"3468be8a-1655-46bd-869e-a1f4653984f1\") " Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.640802 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.646105 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4" (OuterVolumeSpecName: "kube-api-access-pz8k4") pod "3468be8a-1655-46bd-869e-a1f4653984f1" (UID: "3468be8a-1655-46bd-869e-a1f4653984f1"). InnerVolumeSpecName "kube-api-access-pz8k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.661075 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util" (OuterVolumeSpecName: "util") pod "3468be8a-1655-46bd-869e-a1f4653984f1" (UID: "3468be8a-1655-46bd-869e-a1f4653984f1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.742052 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz8k4\" (UniqueName: \"kubernetes.io/projected/3468be8a-1655-46bd-869e-a1f4653984f1-kube-api-access-pz8k4\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:04 crc kubenswrapper[4713]: I0308 00:33:04.742288 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3468be8a-1655-46bd-869e-a1f4653984f1-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.266272 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" event={"ID":"3468be8a-1655-46bd-869e-a1f4653984f1","Type":"ContainerDied","Data":"367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b"} Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.266333 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="367b7c56b758b0fc0049526f46484e41a79cb6b57cc79d878998fb75f5545b9b" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.266397 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c098npc8" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.534292 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.674808 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util\") pod \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.675096 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle\") pod \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.675181 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm5dp\" (UniqueName: \"kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp\") pod \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\" (UID: \"e47eaf9e-75d1-40eb-8671-0ebc9ca47520\") " Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.675585 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle" (OuterVolumeSpecName: "bundle") pod "e47eaf9e-75d1-40eb-8671-0ebc9ca47520" (UID: "e47eaf9e-75d1-40eb-8671-0ebc9ca47520"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.685006 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp" (OuterVolumeSpecName: "kube-api-access-sm5dp") pod "e47eaf9e-75d1-40eb-8671-0ebc9ca47520" (UID: "e47eaf9e-75d1-40eb-8671-0ebc9ca47520"). InnerVolumeSpecName "kube-api-access-sm5dp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.693798 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util" (OuterVolumeSpecName: "util") pod "e47eaf9e-75d1-40eb-8671-0ebc9ca47520" (UID: "e47eaf9e-75d1-40eb-8671-0ebc9ca47520"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.776985 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-util\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.777018 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:05 crc kubenswrapper[4713]: I0308 00:33:05.777031 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm5dp\" (UniqueName: \"kubernetes.io/projected/e47eaf9e-75d1-40eb-8671-0ebc9ca47520-kube-api-access-sm5dp\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.112960 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.113013 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.153691 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.276323 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.276350 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ax2d6h" event={"ID":"e47eaf9e-75d1-40eb-8671-0ebc9ca47520","Type":"ContainerDied","Data":"70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652"} Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.276420 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70f10667c84b68b5828294224680e99f6a19bfaaa95456d3550eb940b7ea4652" Mar 08 00:33:06 crc kubenswrapper[4713]: I0308 00:33:06.317595 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.367550 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.368049 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-svj4c" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="registry-server" containerID="cri-o://ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b" gracePeriod=2 Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.699965 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.821390 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content\") pod \"dca9cde6-7c79-47bd-aacc-d326268e5595\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.821550 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities\") pod \"dca9cde6-7c79-47bd-aacc-d326268e5595\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.821582 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xw25\" (UniqueName: \"kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25\") pod \"dca9cde6-7c79-47bd-aacc-d326268e5595\" (UID: \"dca9cde6-7c79-47bd-aacc-d326268e5595\") " Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.822341 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities" (OuterVolumeSpecName: "utilities") pod "dca9cde6-7c79-47bd-aacc-d326268e5595" (UID: "dca9cde6-7c79-47bd-aacc-d326268e5595"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.822787 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.827096 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25" (OuterVolumeSpecName: "kube-api-access-5xw25") pod "dca9cde6-7c79-47bd-aacc-d326268e5595" (UID: "dca9cde6-7c79-47bd-aacc-d326268e5595"). InnerVolumeSpecName "kube-api-access-5xw25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.880129 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dca9cde6-7c79-47bd-aacc-d326268e5595" (UID: "dca9cde6-7c79-47bd-aacc-d326268e5595"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.924047 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xw25\" (UniqueName: \"kubernetes.io/projected/dca9cde6-7c79-47bd-aacc-d326268e5595-kube-api-access-5xw25\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:08 crc kubenswrapper[4713]: I0308 00:33:08.924076 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dca9cde6-7c79-47bd-aacc-d326268e5595-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.295997 4713 generic.go:334] "Generic (PLEG): container finished" podID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerID="ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b" exitCode=0 Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.296074 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerDied","Data":"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b"} Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.296292 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svj4c" event={"ID":"dca9cde6-7c79-47bd-aacc-d326268e5595","Type":"ContainerDied","Data":"9f82c3dff1939e1e71812e7fa1c087d46f2fb778390ac98acb85ef0038e4d1b2"} Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.296312 4713 scope.go:117] "RemoveContainer" containerID="ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.296084 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svj4c" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.330876 4713 scope.go:117] "RemoveContainer" containerID="ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.334762 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.339787 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-svj4c"] Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.358803 4713 scope.go:117] "RemoveContainer" containerID="99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.373679 4713 scope.go:117] "RemoveContainer" containerID="ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b" Mar 08 00:33:09 crc kubenswrapper[4713]: E0308 00:33:09.374182 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b\": container with ID starting with ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b not found: ID does not exist" containerID="ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.374227 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b"} err="failed to get container status \"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b\": rpc error: code = NotFound desc = could not find container \"ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b\": container with ID starting with ee7835fb72e5d8baa1fd18b584ef3b9ccbe7fd71ca43e706485e4854b1e7b19b not 
found: ID does not exist" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.374259 4713 scope.go:117] "RemoveContainer" containerID="ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a" Mar 08 00:33:09 crc kubenswrapper[4713]: E0308 00:33:09.375404 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a\": container with ID starting with ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a not found: ID does not exist" containerID="ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.375448 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a"} err="failed to get container status \"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a\": rpc error: code = NotFound desc = could not find container \"ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a\": container with ID starting with ca2ec636ba7fe1f82e7f359ccdbed3ec39b94744bbafbc17519b011d5fb3967a not found: ID does not exist" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.375473 4713 scope.go:117] "RemoveContainer" containerID="99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785" Mar 08 00:33:09 crc kubenswrapper[4713]: E0308 00:33:09.375714 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785\": container with ID starting with 99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785 not found: ID does not exist" containerID="99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785" Mar 08 00:33:09 crc kubenswrapper[4713]: I0308 00:33:09.375748 4713 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785"} err="failed to get container status \"99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785\": rpc error: code = NotFound desc = could not find container \"99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785\": container with ID starting with 99596bdc3f5d694747953229651ae0f0cae64257f60bc1a5f0c511f98c9e4785 not found: ID does not exist" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.549131 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" path="/var/lib/kubelet/pods/dca9cde6-7c79-47bd-aacc-d326268e5595/volumes" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.908764 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-795859486c-d7k9q"] Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909263 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="extract-content" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909275 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="extract-content" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909283 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="pull" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909289 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="pull" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909298 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909305 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909316 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="extract-utilities" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909323 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="extract-utilities" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909334 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="registry-server" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909339 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="registry-server" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909348 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="util" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909353 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="util" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909365 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="pull" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909371 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="pull" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909380 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="util" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909386 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="util" Mar 08 00:33:10 crc kubenswrapper[4713]: E0308 00:33:10.909395 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909400 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909493 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e47eaf9e-75d1-40eb-8671-0ebc9ca47520" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909509 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca9cde6-7c79-47bd-aacc-d326268e5595" containerName="registry-server" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909517 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="3468be8a-1655-46bd-869e-a1f4653984f1" containerName="extract" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.909898 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.912428 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-h6xd8" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.949247 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh5fj\" (UniqueName: \"kubernetes.io/projected/934a7934-e52f-4279-9c2a-4255daf78d5a-kube-api-access-lh5fj\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.949407 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/934a7934-e52f-4279-9c2a-4255daf78d5a-runner\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:10 crc kubenswrapper[4713]: I0308 00:33:10.959961 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-795859486c-d7k9q"] Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.050080 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh5fj\" (UniqueName: \"kubernetes.io/projected/934a7934-e52f-4279-9c2a-4255daf78d5a-kube-api-access-lh5fj\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.050164 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: 
\"kubernetes.io/empty-dir/934a7934-e52f-4279-9c2a-4255daf78d5a-runner\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.050636 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/934a7934-e52f-4279-9c2a-4255daf78d5a-runner\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.075734 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh5fj\" (UniqueName: \"kubernetes.io/projected/934a7934-e52f-4279-9c2a-4255daf78d5a-kube-api-access-lh5fj\") pod \"smart-gateway-operator-795859486c-d7k9q\" (UID: \"934a7934-e52f-4279-9c2a-4255daf78d5a\") " pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.229366 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" Mar 08 00:33:11 crc kubenswrapper[4713]: I0308 00:33:11.676206 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-795859486c-d7k9q"] Mar 08 00:33:11 crc kubenswrapper[4713]: W0308 00:33:11.681760 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod934a7934_e52f_4279_9c2a_4255daf78d5a.slice/crio-134b6af8978f8e78c7363be5f8f154fa3f965c441c70a1c2db665a8a04a79dd6 WatchSource:0}: Error finding container 134b6af8978f8e78c7363be5f8f154fa3f965c441c70a1c2db665a8a04a79dd6: Status 404 returned error can't find the container with id 134b6af8978f8e78c7363be5f8f154fa3f965c441c70a1c2db665a8a04a79dd6 Mar 08 00:33:12 crc kubenswrapper[4713]: I0308 00:33:12.319986 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" event={"ID":"934a7934-e52f-4279-9c2a-4255daf78d5a","Type":"ContainerStarted","Data":"134b6af8978f8e78c7363be5f8f154fa3f965c441c70a1c2db665a8a04a79dd6"} Mar 08 00:33:13 crc kubenswrapper[4713]: I0308 00:33:13.911095 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4"] Mar 08 00:33:13 crc kubenswrapper[4713]: I0308 00:33:13.912001 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:13 crc kubenswrapper[4713]: I0308 00:33:13.912063 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4"] Mar 08 00:33:13 crc kubenswrapper[4713]: I0308 00:33:13.917280 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-rwbl6" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.019804 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwwzc\" (UniqueName: \"kubernetes.io/projected/c714eef0-0fe5-4836-80e1-c640aa9527e7-kube-api-access-hwwzc\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.020172 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/c714eef0-0fe5-4836-80e1-c640aa9527e7-runner\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.133149 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwwzc\" (UniqueName: \"kubernetes.io/projected/c714eef0-0fe5-4836-80e1-c640aa9527e7-kube-api-access-hwwzc\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.133525 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: 
\"kubernetes.io/empty-dir/c714eef0-0fe5-4836-80e1-c640aa9527e7-runner\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.133992 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/c714eef0-0fe5-4836-80e1-c640aa9527e7-runner\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.159972 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwwzc\" (UniqueName: \"kubernetes.io/projected/c714eef0-0fe5-4836-80e1-c640aa9527e7-kube-api-access-hwwzc\") pod \"service-telemetry-operator-6f9dc9fb4b-dzbm4\" (UID: \"c714eef0-0fe5-4836-80e1-c640aa9527e7\") " pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.234556 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" Mar 08 00:33:14 crc kubenswrapper[4713]: I0308 00:33:14.427779 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4"] Mar 08 00:33:15 crc kubenswrapper[4713]: I0308 00:33:15.350647 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" event={"ID":"c714eef0-0fe5-4836-80e1-c640aa9527e7","Type":"ContainerStarted","Data":"9737dd344b4808d0d70e88f2bd07c93cf44ed1c47bfa5c875aa40d32d36f57e3"} Mar 08 00:33:27 crc kubenswrapper[4713]: E0308 00:33:27.466938 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:stable-1.5" Mar 08 00:33:27 crc kubenswrapper[4713]: E0308 00:33:27.467717 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:stable-1.5,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{
Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1772929848,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lh5fj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-795859486c-d7k9q_service-telemetry(934a7934-e52f-4279-9c2a-4255daf78d5a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 00:33:27 crc kubenswrapper[4713]: E0308 00:33:27.468926 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" 
podUID="934a7934-e52f-4279-9c2a-4255daf78d5a" Mar 08 00:33:28 crc kubenswrapper[4713]: E0308 00:33:28.458185 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:stable-1.5\\\"\"" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" podUID="934a7934-e52f-4279-9c2a-4255daf78d5a" Mar 08 00:33:33 crc kubenswrapper[4713]: I0308 00:33:33.498296 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" event={"ID":"c714eef0-0fe5-4836-80e1-c640aa9527e7","Type":"ContainerStarted","Data":"a6d41f541dc39b5caf5a4c1055633279153e411158c2089867c891f8a910d42f"} Mar 08 00:33:33 crc kubenswrapper[4713]: I0308 00:33:33.516396 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-6f9dc9fb4b-dzbm4" podStartSLOduration=2.542619887 podStartE2EDuration="20.516377944s" podCreationTimestamp="2026-03-08 00:33:13 +0000 UTC" firstStartedPulling="2026-03-08 00:33:14.434716931 +0000 UTC m=+1648.554349164" lastFinishedPulling="2026-03-08 00:33:32.408474988 +0000 UTC m=+1666.528107221" observedRunningTime="2026-03-08 00:33:33.515807379 +0000 UTC m=+1667.635439632" watchObservedRunningTime="2026-03-08 00:33:33.516377944 +0000 UTC m=+1667.636010167" Mar 08 00:33:34 crc kubenswrapper[4713]: I0308 00:33:34.500305 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:33:34 crc kubenswrapper[4713]: I0308 00:33:34.500377 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" 
podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:33:34 crc kubenswrapper[4713]: I0308 00:33:34.500429 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:33:34 crc kubenswrapper[4713]: I0308 00:33:34.501149 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 00:33:34 crc kubenswrapper[4713]: I0308 00:33:34.501209 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" containerID="cri-o://013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" gracePeriod=600 Mar 08 00:33:34 crc kubenswrapper[4713]: E0308 00:33:34.618742 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:33:35 crc kubenswrapper[4713]: I0308 00:33:35.523955 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" exitCode=0 Mar 08 
00:33:35 crc kubenswrapper[4713]: I0308 00:33:35.524391 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b"} Mar 08 00:33:35 crc kubenswrapper[4713]: I0308 00:33:35.524428 4713 scope.go:117] "RemoveContainer" containerID="bbcc55077b8279f43ab1318272be3487b4b4457dea7182ea0e9d79f49619de4c" Mar 08 00:33:35 crc kubenswrapper[4713]: I0308 00:33:35.524960 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:33:35 crc kubenswrapper[4713]: E0308 00:33:35.525175 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:33:42 crc kubenswrapper[4713]: I0308 00:33:42.570108 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" event={"ID":"934a7934-e52f-4279-9c2a-4255daf78d5a","Type":"ContainerStarted","Data":"55d59f49b25ee77591ecf1d954ac0737b918ba0688322fb82ae0f4139f4d3519"} Mar 08 00:33:42 crc kubenswrapper[4713]: I0308 00:33:42.596470 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-795859486c-d7k9q" podStartSLOduration=2.213295157 podStartE2EDuration="32.596443628s" podCreationTimestamp="2026-03-08 00:33:10 +0000 UTC" firstStartedPulling="2026-03-08 00:33:11.683561097 +0000 UTC m=+1645.803193330" lastFinishedPulling="2026-03-08 00:33:42.066709568 +0000 UTC m=+1676.186341801" 
observedRunningTime="2026-03-08 00:33:42.589416041 +0000 UTC m=+1676.709048274" watchObservedRunningTime="2026-03-08 00:33:42.596443628 +0000 UTC m=+1676.716075871" Mar 08 00:33:46 crc kubenswrapper[4713]: I0308 00:33:46.544286 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:33:46 crc kubenswrapper[4713]: E0308 00:33:46.544782 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.439164 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.440607 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.442414 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.443096 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-8dc86" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.443097 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.443262 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.443869 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.445791 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.447606 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.458558 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.568804 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " 
pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.568902 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.568929 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.568978 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkmlb\" (UniqueName: \"kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.569148 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.569197 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.569280 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670168 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670190 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config\") 
pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670212 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkmlb\" (UniqueName: \"kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670254 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670271 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.670298 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc 
kubenswrapper[4713]: I0308 00:33:53.671279 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.676314 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.677770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.685242 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.686716 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials\") pod 
\"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.687474 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.703033 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkmlb\" (UniqueName: \"kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb\") pod \"default-interconnect-68864d46cb-t7lzv\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:53 crc kubenswrapper[4713]: I0308 00:33:53.765203 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:33:54 crc kubenswrapper[4713]: I0308 00:33:54.211084 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:33:54 crc kubenswrapper[4713]: I0308 00:33:54.650429 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" event={"ID":"52ed2487-d016-4930-a9ec-98500bfc0db3","Type":"ContainerStarted","Data":"71917d86375943e31a9292ae7412991594bcc498f11ed7d30ee0bdc265d89c06"} Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.141182 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548834-njxhh"] Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.142198 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.144575 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.144752 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.144940 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.151748 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548834-njxhh"] Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.160085 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p7jc\" (UniqueName: \"kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc\") pod \"auto-csr-approver-29548834-njxhh\" 
(UID: \"ef90820d-fdcc-4ff1-97db-756e8c96851a\") " pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.260864 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p7jc\" (UniqueName: \"kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc\") pod \"auto-csr-approver-29548834-njxhh\" (UID: \"ef90820d-fdcc-4ff1-97db-756e8c96851a\") " pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.291915 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p7jc\" (UniqueName: \"kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc\") pod \"auto-csr-approver-29548834-njxhh\" (UID: \"ef90820d-fdcc-4ff1-97db-756e8c96851a\") " pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.465780 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.549673 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:34:00 crc kubenswrapper[4713]: E0308 00:34:00.550144 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.652556 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548834-njxhh"] Mar 08 00:34:00 crc kubenswrapper[4713]: W0308 00:34:00.653488 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef90820d_fdcc_4ff1_97db_756e8c96851a.slice/crio-fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d WatchSource:0}: Error finding container fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d: Status 404 returned error can't find the container with id fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.703807 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548834-njxhh" event={"ID":"ef90820d-fdcc-4ff1-97db-756e8c96851a","Type":"ContainerStarted","Data":"fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d"} Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.705053 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" 
event={"ID":"52ed2487-d016-4930-a9ec-98500bfc0db3","Type":"ContainerStarted","Data":"2ee28c2f8fc1433f9ba19f9c07ab8d85929756f524dd0e86526ecf528ab6aea3"} Mar 08 00:34:00 crc kubenswrapper[4713]: I0308 00:34:00.721011 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" podStartSLOduration=1.914970427 podStartE2EDuration="7.72099367s" podCreationTimestamp="2026-03-08 00:33:53 +0000 UTC" firstStartedPulling="2026-03-08 00:33:54.214460911 +0000 UTC m=+1688.334093144" lastFinishedPulling="2026-03-08 00:34:00.020484124 +0000 UTC m=+1694.140116387" observedRunningTime="2026-03-08 00:34:00.720030335 +0000 UTC m=+1694.839662588" watchObservedRunningTime="2026-03-08 00:34:00.72099367 +0000 UTC m=+1694.840625903" Mar 08 00:34:01 crc kubenswrapper[4713]: I0308 00:34:01.714681 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548834-njxhh" event={"ID":"ef90820d-fdcc-4ff1-97db-756e8c96851a","Type":"ContainerStarted","Data":"54d98c92ae122fbfe885e4ff1e76b36a0e389e6c7ef0c5d932a7c247396198f3"} Mar 08 00:34:01 crc kubenswrapper[4713]: I0308 00:34:01.729906 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548834-njxhh" podStartSLOduration=0.91415066 podStartE2EDuration="1.729884406s" podCreationTimestamp="2026-03-08 00:34:00 +0000 UTC" firstStartedPulling="2026-03-08 00:34:00.657032371 +0000 UTC m=+1694.776664604" lastFinishedPulling="2026-03-08 00:34:01.472766097 +0000 UTC m=+1695.592398350" observedRunningTime="2026-03-08 00:34:01.724880454 +0000 UTC m=+1695.844512707" watchObservedRunningTime="2026-03-08 00:34:01.729884406 +0000 UTC m=+1695.849516629" Mar 08 00:34:02 crc kubenswrapper[4713]: I0308 00:34:02.727000 4713 generic.go:334] "Generic (PLEG): container finished" podID="ef90820d-fdcc-4ff1-97db-756e8c96851a" containerID="54d98c92ae122fbfe885e4ff1e76b36a0e389e6c7ef0c5d932a7c247396198f3" 
exitCode=0 Mar 08 00:34:02 crc kubenswrapper[4713]: I0308 00:34:02.727057 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548834-njxhh" event={"ID":"ef90820d-fdcc-4ff1-97db-756e8c96851a","Type":"ContainerDied","Data":"54d98c92ae122fbfe885e4ff1e76b36a0e389e6c7ef0c5d932a7c247396198f3"} Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.772247 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.773901 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778222 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778345 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778476 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778584 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778694 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778744 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.778869 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 
00:34:03.778902 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-78nxz" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.779003 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.779084 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.792813 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.910912 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.910956 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-web-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.910984 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911003 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf91b8a6-24ec-4c39-8337-f05acf19e199-config-out\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911035 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911048 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911072 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-tls-assets\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911086 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 
00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911104 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwbp\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-kube-api-access-qbwbp\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911124 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911148 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.911168 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:03 crc kubenswrapper[4713]: I0308 00:34:03.997659 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.012774 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-web-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013280 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf91b8a6-24ec-4c39-8337-f05acf19e199-config-out\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013745 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013804 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013851 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: 
\"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013889 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013906 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-tls-assets\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013930 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwbp\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-kube-api-access-qbwbp\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.013961 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.014012 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" 
(UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: E0308 00:34:04.014006 4713 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.014040 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.014100 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: E0308 00:34:04.014143 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls podName:cf91b8a6-24ec-4c39-8337-f05acf19e199 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:04.514108128 +0000 UTC m=+1698.633740371 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "cf91b8a6-24ec-4c39-8337-f05acf19e199") : secret "default-prometheus-proxy-tls" not found Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.015417 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.015419 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.015439 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.016331 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf91b8a6-24ec-4c39-8337-f05acf19e199-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 
00:34:04.020479 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.020513 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6f960d04c1718d4ca7632e6054426d041bf9e016104b49269b2d10d057333c68/globalmount\"" pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.021652 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.021663 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-web-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.021914 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-config\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.022789 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-tls-assets\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.024130 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf91b8a6-24ec-4c39-8337-f05acf19e199-config-out\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.044724 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbwbp\" (UniqueName: \"kubernetes.io/projected/cf91b8a6-24ec-4c39-8337-f05acf19e199-kube-api-access-qbwbp\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.049877 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c0e12a34-f5ae-4cbb-8e85-5b0ba7390133\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.115631 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p7jc\" (UniqueName: \"kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc\") pod \"ef90820d-fdcc-4ff1-97db-756e8c96851a\" (UID: \"ef90820d-fdcc-4ff1-97db-756e8c96851a\") " Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.118869 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc" 
(OuterVolumeSpecName: "kube-api-access-9p7jc") pod "ef90820d-fdcc-4ff1-97db-756e8c96851a" (UID: "ef90820d-fdcc-4ff1-97db-756e8c96851a"). InnerVolumeSpecName "kube-api-access-9p7jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.217525 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p7jc\" (UniqueName: \"kubernetes.io/projected/ef90820d-fdcc-4ff1-97db-756e8c96851a-kube-api-access-9p7jc\") on node \"crc\" DevicePath \"\"" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.521655 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:04 crc kubenswrapper[4713]: E0308 00:34:04.521906 4713 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 08 00:34:04 crc kubenswrapper[4713]: E0308 00:34:04.521963 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls podName:cf91b8a6-24ec-4c39-8337-f05acf19e199 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:05.521944996 +0000 UTC m=+1699.641577239 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "cf91b8a6-24ec-4c39-8337-f05acf19e199") : secret "default-prometheus-proxy-tls" not found Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.754893 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548834-njxhh" event={"ID":"ef90820d-fdcc-4ff1-97db-756e8c96851a","Type":"ContainerDied","Data":"fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d"} Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.754934 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd2b7ee2d0f0af78b41892d65cdb0f57c93d9a4db60f2fd5702516f644cabf6d" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.755001 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548834-njxhh" Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.779928 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548828-b8fft"] Mar 08 00:34:04 crc kubenswrapper[4713]: I0308 00:34:04.786661 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548828-b8fft"] Mar 08 00:34:05 crc kubenswrapper[4713]: I0308 00:34:05.535849 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:05 crc kubenswrapper[4713]: I0308 00:34:05.543136 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/cf91b8a6-24ec-4c39-8337-f05acf19e199-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"cf91b8a6-24ec-4c39-8337-f05acf19e199\") " pod="service-telemetry/prometheus-default-0" Mar 08 00:34:05 crc kubenswrapper[4713]: I0308 00:34:05.597052 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 08 00:34:05 crc kubenswrapper[4713]: I0308 00:34:05.818705 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 08 00:34:06 crc kubenswrapper[4713]: I0308 00:34:06.548249 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91f9ab32-0c71-4b60-b499-75b2f4f4dcf3" path="/var/lib/kubelet/pods/91f9ab32-0c71-4b60-b499-75b2f4f4dcf3/volumes" Mar 08 00:34:06 crc kubenswrapper[4713]: I0308 00:34:06.776645 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerStarted","Data":"afa3c68f33fcf026ae023a61d4786edb94046d5cecf6df345b66c62165522196"} Mar 08 00:34:09 crc kubenswrapper[4713]: I0308 00:34:09.797684 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerStarted","Data":"f5474a515132f9dfb600e5576fc25401132b27f36d91cababcdd4e20fbe4260a"} Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.653163 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lfj62"] Mar 08 00:34:13 crc kubenswrapper[4713]: E0308 00:34:13.653431 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef90820d-fdcc-4ff1-97db-756e8c96851a" containerName="oc" Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.653443 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef90820d-fdcc-4ff1-97db-756e8c96851a" 
containerName="oc" Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.653545 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef90820d-fdcc-4ff1-97db-756e8c96851a" containerName="oc" Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.653969 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62" Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.668740 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lfj62"] Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.758228 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6gh5\" (UniqueName: \"kubernetes.io/projected/6bdaeb5b-32b1-4454-9a68-0893de41cc75-kube-api-access-b6gh5\") pod \"default-snmp-webhook-6856cfb745-lfj62\" (UID: \"6bdaeb5b-32b1-4454-9a68-0893de41cc75\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62" Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.858934 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6gh5\" (UniqueName: \"kubernetes.io/projected/6bdaeb5b-32b1-4454-9a68-0893de41cc75-kube-api-access-b6gh5\") pod \"default-snmp-webhook-6856cfb745-lfj62\" (UID: \"6bdaeb5b-32b1-4454-9a68-0893de41cc75\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62" Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.882335 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6gh5\" (UniqueName: \"kubernetes.io/projected/6bdaeb5b-32b1-4454-9a68-0893de41cc75-kube-api-access-b6gh5\") pod \"default-snmp-webhook-6856cfb745-lfj62\" (UID: \"6bdaeb5b-32b1-4454-9a68-0893de41cc75\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62" Mar 08 00:34:13 crc kubenswrapper[4713]: I0308 00:34:13.981420 4713 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62" Mar 08 00:34:14 crc kubenswrapper[4713]: I0308 00:34:14.201875 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lfj62"] Mar 08 00:34:14 crc kubenswrapper[4713]: I0308 00:34:14.828926 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62" event={"ID":"6bdaeb5b-32b1-4454-9a68-0893de41cc75","Type":"ContainerStarted","Data":"321dfb024e3ca1c0e5b1d095dd0dfe4e9ba64d4c64cbaa49ee57bd89064c6e6f"} Mar 08 00:34:15 crc kubenswrapper[4713]: I0308 00:34:15.540503 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:34:15 crc kubenswrapper[4713]: E0308 00:34:15.541004 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:34:16 crc kubenswrapper[4713]: I0308 00:34:16.846410 4713 generic.go:334] "Generic (PLEG): container finished" podID="cf91b8a6-24ec-4c39-8337-f05acf19e199" containerID="f5474a515132f9dfb600e5576fc25401132b27f36d91cababcdd4e20fbe4260a" exitCode=0 Mar 08 00:34:16 crc kubenswrapper[4713]: I0308 00:34:16.846453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerDied","Data":"f5474a515132f9dfb600e5576fc25401132b27f36d91cababcdd4e20fbe4260a"} Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.527733 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/alertmanager-default-0"] Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.529445 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.531494 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.531697 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.532058 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-stbp8" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.532137 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.534400 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.535242 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.549080 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612716 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc 
kubenswrapper[4713]: I0308 00:34:17.612783 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612818 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-web-config\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612868 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-tls-assets\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612896 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjf2g\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-kube-api-access-rjf2g\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612933 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-volume\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " 
pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612964 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-out\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.612981 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.613012 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714519 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714589 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-web-config\") pod \"alertmanager-default-0\" (UID: 
\"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714620 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-tls-assets\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714643 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjf2g\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-kube-api-access-rjf2g\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714663 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-volume\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714695 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-out\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714711 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " 
pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714728 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.714752 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: E0308 00:34:17.714881 4713 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 08 00:34:17 crc kubenswrapper[4713]: E0308 00:34:17.714937 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls podName:76d6e5d8-8303-43ac-a477-0dfe579adad2 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:18.214914271 +0000 UTC m=+1712.334546514 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "76d6e5d8-8303-43ac-a477-0dfe579adad2") : secret "default-alertmanager-proxy-tls" not found Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.720961 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.721325 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-web-config\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.722001 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-volume\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.722645 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.723504 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME 
capability not set. Skipping MountDevice... Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.723531 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8166928179f9697bb27271c5054606ad15f49cf71086ec4487477abe8fe5c88e/globalmount\"" pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.724211 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-tls-assets\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.736068 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/76d6e5d8-8303-43ac-a477-0dfe579adad2-config-out\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.740770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjf2g\" (UniqueName: \"kubernetes.io/projected/76d6e5d8-8303-43ac-a477-0dfe579adad2-kube-api-access-rjf2g\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:17 crc kubenswrapper[4713]: I0308 00:34:17.761246 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a49f59-719b-4120-bf1b-b46ee54fb347\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:18 crc kubenswrapper[4713]: I0308 00:34:18.224682 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:18 crc kubenswrapper[4713]: E0308 00:34:18.224961 4713 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 08 00:34:18 crc kubenswrapper[4713]: E0308 00:34:18.225017 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls podName:76d6e5d8-8303-43ac-a477-0dfe579adad2 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:19.22499791 +0000 UTC m=+1713.344630143 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "76d6e5d8-8303-43ac-a477-0dfe579adad2") : secret "default-alertmanager-proxy-tls" not found Mar 08 00:34:19 crc kubenswrapper[4713]: I0308 00:34:19.240879 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:19 crc kubenswrapper[4713]: E0308 00:34:19.241087 4713 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 08 00:34:19 crc kubenswrapper[4713]: E0308 00:34:19.241387 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls podName:76d6e5d8-8303-43ac-a477-0dfe579adad2 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:21.241365905 +0000 UTC m=+1715.360998138 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "76d6e5d8-8303-43ac-a477-0dfe579adad2") : secret "default-alertmanager-proxy-tls" not found Mar 08 00:34:21 crc kubenswrapper[4713]: I0308 00:34:21.275408 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:21 crc kubenswrapper[4713]: I0308 00:34:21.289638 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d6e5d8-8303-43ac-a477-0dfe579adad2-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"76d6e5d8-8303-43ac-a477-0dfe579adad2\") " pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:21 crc kubenswrapper[4713]: I0308 00:34:21.450919 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 08 00:34:21 crc kubenswrapper[4713]: I0308 00:34:21.566648 4713 scope.go:117] "RemoveContainer" containerID="ef6200b05d87f80e3b68b8cd3aa4e78082a7e3103ea753de97cc7213a72cdd71" Mar 08 00:34:23 crc kubenswrapper[4713]: I0308 00:34:23.192223 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 08 00:34:23 crc kubenswrapper[4713]: I0308 00:34:23.900835 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62" event={"ID":"6bdaeb5b-32b1-4454-9a68-0893de41cc75","Type":"ContainerStarted","Data":"ab0d8dd635c9519b06a29c0febcdbe31ec56160d22dd909035519284a196f3f3"} Mar 08 00:34:23 crc kubenswrapper[4713]: I0308 00:34:23.903061 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerStarted","Data":"3eefc93efd42b43d62250985baecbbf57dbaf8879dbc4ec699d874d8bebd51e3"} Mar 08 00:34:23 crc kubenswrapper[4713]: I0308 00:34:23.925652 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-lfj62" podStartSLOduration=2.316121308 podStartE2EDuration="10.925629813s" podCreationTimestamp="2026-03-08 00:34:13 +0000 UTC" firstStartedPulling="2026-03-08 00:34:14.21798119 +0000 UTC m=+1708.337613423" lastFinishedPulling="2026-03-08 00:34:22.827489695 +0000 UTC m=+1716.947121928" observedRunningTime="2026-03-08 00:34:23.9145975 +0000 UTC m=+1718.034229743" watchObservedRunningTime="2026-03-08 00:34:23.925629813 +0000 UTC m=+1718.045262046" Mar 08 00:34:25 crc kubenswrapper[4713]: I0308 00:34:25.917839 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" 
event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerStarted","Data":"e57a6864734bb9e4583b73682f563411af559e6e88938f4da33f38a2c14b661b"} Mar 08 00:34:26 crc kubenswrapper[4713]: I0308 00:34:26.548073 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:34:26 crc kubenswrapper[4713]: E0308 00:34:26.548297 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:34:27 crc kubenswrapper[4713]: I0308 00:34:27.935904 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerStarted","Data":"bc4fb448f721b6bb976cb2e1f49345a27cd1c296353402161de108ed025f0716"} Mar 08 00:34:28 crc kubenswrapper[4713]: I0308 00:34:28.944224 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerStarted","Data":"3ce63770185f927d536050fcdf86cad8cc018a110fb681959f2e74ddef692d8e"} Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.934084 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"] Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.941187 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.942626 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-tzd69" Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.942892 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.943892 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.944197 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Mar 08 00:34:30 crc kubenswrapper[4713]: I0308 00:34:30.944588 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"] Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.007631 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wmgv\" (UniqueName: \"kubernetes.io/projected/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-kube-api-access-6wmgv\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.007682 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.007717 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.007745 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.007777 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.108883 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109004 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wmgv\" (UniqueName: \"kubernetes.io/projected/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-kube-api-access-6wmgv\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109041 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109083 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109116 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: E0308 00:34:31.109370 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found 
Mar 08 00:34:31 crc kubenswrapper[4713]: E0308 00:34:31.109453 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls podName:7aaf11cd-f1cf-42c7-9fe9-52880e0af19c nodeName:}" failed. No retries permitted until 2026-03-08 00:34:31.60943345 +0000 UTC m=+1725.729065683 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" (UID: "7aaf11cd-f1cf-42c7-9fe9-52880e0af19c") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109480 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.109948 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.120600 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.127637 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wmgv\" (UniqueName: \"kubernetes.io/projected/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-kube-api-access-6wmgv\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: I0308 00:34:31.618478 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:31 crc kubenswrapper[4713]: E0308 00:34:31.618658 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 08 00:34:31 crc kubenswrapper[4713]: E0308 00:34:31.618715 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls podName:7aaf11cd-f1cf-42c7-9fe9-52880e0af19c nodeName:}" failed. No retries permitted until 2026-03-08 00:34:32.618698147 +0000 UTC m=+1726.738330380 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" (UID: "7aaf11cd-f1cf-42c7-9fe9-52880e0af19c") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 08 00:34:32 crc kubenswrapper[4713]: I0308 00:34:32.633421 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:32 crc kubenswrapper[4713]: I0308 00:34:32.639060 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aaf11cd-f1cf-42c7-9fe9-52880e0af19c-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq\" (UID: \"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:32 crc kubenswrapper[4713]: I0308 00:34:32.767779 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.001352 4713 generic.go:334] "Generic (PLEG): container finished" podID="76d6e5d8-8303-43ac-a477-0dfe579adad2" containerID="e57a6864734bb9e4583b73682f563411af559e6e88938f4da33f38a2c14b661b" exitCode=0 Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.001974 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerDied","Data":"e57a6864734bb9e4583b73682f563411af559e6e88938f4da33f38a2c14b661b"} Mar 08 00:34:33 crc kubenswrapper[4713]: W0308 00:34:33.096155 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aaf11cd_f1cf_42c7_9fe9_52880e0af19c.slice/crio-d0bd4e8ad16493c36a6f58730fea2ea22c194f793f33b51c5aed9d02256dbdb2 WatchSource:0}: Error finding container d0bd4e8ad16493c36a6f58730fea2ea22c194f793f33b51c5aed9d02256dbdb2: Status 404 returned error can't find the container with id d0bd4e8ad16493c36a6f58730fea2ea22c194f793f33b51c5aed9d02256dbdb2 Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.103340 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq"] Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.706495 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg"] Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.707754 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.710317 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.710331 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.719819 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg"] Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.756611 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/367439a6-a382-49f1-b0af-cf399b5a6401-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.756700 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.756748 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4p4g\" (UniqueName: \"kubernetes.io/projected/367439a6-a382-49f1-b0af-cf399b5a6401-kube-api-access-m4p4g\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: 
\"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.756784 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/367439a6-a382-49f1-b0af-cf399b5a6401-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.756847 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.859994 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/367439a6-a382-49f1-b0af-cf399b5a6401-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.860073 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 
08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.860103 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4p4g\" (UniqueName: \"kubernetes.io/projected/367439a6-a382-49f1-b0af-cf399b5a6401-kube-api-access-m4p4g\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.860129 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/367439a6-a382-49f1-b0af-cf399b5a6401-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.860160 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: E0308 00:34:33.860280 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 08 00:34:33 crc kubenswrapper[4713]: E0308 00:34:33.860332 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls podName:367439a6-a382-49f1-b0af-cf399b5a6401 nodeName:}" failed. 
No retries permitted until 2026-03-08 00:34:34.360316205 +0000 UTC m=+1728.479948438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" (UID: "367439a6-a382-49f1-b0af-cf399b5a6401") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.863746 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/367439a6-a382-49f1-b0af-cf399b5a6401-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.865384 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/367439a6-a382-49f1-b0af-cf399b5a6401-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.867780 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:33 crc kubenswrapper[4713]: I0308 00:34:33.889617 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4p4g\" (UniqueName: 
\"kubernetes.io/projected/367439a6-a382-49f1-b0af-cf399b5a6401-kube-api-access-m4p4g\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:34 crc kubenswrapper[4713]: I0308 00:34:34.012316 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"d0bd4e8ad16493c36a6f58730fea2ea22c194f793f33b51c5aed9d02256dbdb2"} Mar 08 00:34:34 crc kubenswrapper[4713]: I0308 00:34:34.365458 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:34 crc kubenswrapper[4713]: E0308 00:34:34.365614 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 08 00:34:34 crc kubenswrapper[4713]: E0308 00:34:34.365662 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls podName:367439a6-a382-49f1-b0af-cf399b5a6401 nodeName:}" failed. No retries permitted until 2026-03-08 00:34:35.365648638 +0000 UTC m=+1729.485280871 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" (UID: "367439a6-a382-49f1-b0af-cf399b5a6401") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 08 00:34:35 crc kubenswrapper[4713]: I0308 00:34:35.380268 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:35 crc kubenswrapper[4713]: I0308 00:34:35.391554 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/367439a6-a382-49f1-b0af-cf399b5a6401-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg\" (UID: \"367439a6-a382-49f1-b0af-cf399b5a6401\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:35 crc kubenswrapper[4713]: I0308 00:34:35.525342 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" Mar 08 00:34:38 crc kubenswrapper[4713]: I0308 00:34:38.547024 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:34:38 crc kubenswrapper[4713]: E0308 00:34:38.547731 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.073257 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl"] Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.075370 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.084297 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.084328 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.129382 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl"] Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.156718 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/fff80c8a-de9a-483b-8be3-5ce1423649cb-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.156802 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.156868 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/fff80c8a-de9a-483b-8be3-5ce1423649cb-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: 
\"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.156933 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.156966 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc5vw\" (UniqueName: \"kubernetes.io/projected/fff80c8a-de9a-483b-8be3-5ce1423649cb-kube-api-access-zc5vw\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258276 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/fff80c8a-de9a-483b-8be3-5ce1423649cb-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258382 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" 
Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258427 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/fff80c8a-de9a-483b-8be3-5ce1423649cb-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258499 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258529 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc5vw\" (UniqueName: \"kubernetes.io/projected/fff80c8a-de9a-483b-8be3-5ce1423649cb-kube-api-access-zc5vw\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: E0308 00:34:39.258685 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 08 00:34:39 crc kubenswrapper[4713]: E0308 00:34:39.258785 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls podName:fff80c8a-de9a-483b-8be3-5ce1423649cb nodeName:}" failed. 
No retries permitted until 2026-03-08 00:34:39.758764933 +0000 UTC m=+1733.878397166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" (UID: "fff80c8a-de9a-483b-8be3-5ce1423649cb") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.258783 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/fff80c8a-de9a-483b-8be3-5ce1423649cb-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.259632 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/fff80c8a-de9a-483b-8be3-5ce1423649cb-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.272209 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.296176 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc5vw\" (UniqueName: 
\"kubernetes.io/projected/fff80c8a-de9a-483b-8be3-5ce1423649cb-kube-api-access-zc5vw\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: I0308 00:34:39.764774 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:39 crc kubenswrapper[4713]: E0308 00:34:39.765046 4713 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 08 00:34:39 crc kubenswrapper[4713]: E0308 00:34:39.765228 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls podName:fff80c8a-de9a-483b-8be3-5ce1423649cb nodeName:}" failed. No retries permitted until 2026-03-08 00:34:40.765209105 +0000 UTC m=+1734.884841338 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" (UID: "fff80c8a-de9a-483b-8be3-5ce1423649cb") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 08 00:34:40 crc kubenswrapper[4713]: I0308 00:34:40.733596 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg"] Mar 08 00:34:40 crc kubenswrapper[4713]: I0308 00:34:40.783315 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:40 crc kubenswrapper[4713]: I0308 00:34:40.791725 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/fff80c8a-de9a-483b-8be3-5ce1423649cb-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl\" (UID: \"fff80c8a-de9a-483b-8be3-5ce1423649cb\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:40 crc kubenswrapper[4713]: W0308 00:34:40.883081 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod367439a6_a382_49f1_b0af_cf399b5a6401.slice/crio-db4af417f078b5f9580d5d86c051ae807c8a1d53ef329af65c71d58c5921c5cc WatchSource:0}: Error finding container db4af417f078b5f9580d5d86c051ae807c8a1d53ef329af65c71d58c5921c5cc: Status 404 returned error can't find the container with id 
db4af417f078b5f9580d5d86c051ae807c8a1d53ef329af65c71d58c5921c5cc Mar 08 00:34:40 crc kubenswrapper[4713]: I0308 00:34:40.902138 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" Mar 08 00:34:41 crc kubenswrapper[4713]: I0308 00:34:41.113996 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"db4af417f078b5f9580d5d86c051ae807c8a1d53ef329af65c71d58c5921c5cc"} Mar 08 00:34:41 crc kubenswrapper[4713]: I0308 00:34:41.146639 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"cf91b8a6-24ec-4c39-8337-f05acf19e199","Type":"ContainerStarted","Data":"de59d2f03d3d2f84d9171aac8cf777a73135b57a458d2b531e6aaed4c253de19"} Mar 08 00:34:41 crc kubenswrapper[4713]: I0308 00:34:41.180351 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=4.746775209 podStartE2EDuration="39.180327321s" podCreationTimestamp="2026-03-08 00:34:02 +0000 UTC" firstStartedPulling="2026-03-08 00:34:05.822537241 +0000 UTC m=+1699.942169484" lastFinishedPulling="2026-03-08 00:34:40.256089363 +0000 UTC m=+1734.375721596" observedRunningTime="2026-03-08 00:34:41.174753933 +0000 UTC m=+1735.294386176" watchObservedRunningTime="2026-03-08 00:34:41.180327321 +0000 UTC m=+1735.299959564" Mar 08 00:34:41 crc kubenswrapper[4713]: I0308 00:34:41.369170 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl"] Mar 08 00:34:42 crc kubenswrapper[4713]: I0308 00:34:42.153597 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" 
event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"21359b62803c17db8a61e255ac740d8bb95576dae94b515912debfa309c1e4b3"} Mar 08 00:34:42 crc kubenswrapper[4713]: I0308 00:34:42.156206 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"6516bdef4f9306692f82cf58bd85b7ff26eccea5c9321e0980e559bd7036b868"} Mar 08 00:34:45 crc kubenswrapper[4713]: I0308 00:34:45.597794 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.567736 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4"] Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.569391 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.573184 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.573185 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.577314 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4"] Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.671899 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf7l6\" (UniqueName: \"kubernetes.io/projected/dc460969-e1ae-4bac-8893-7677ac74787b-kube-api-access-cf7l6\") pod 
\"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.671973 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/dc460969-e1ae-4bac-8893-7677ac74787b-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.672121 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/dc460969-e1ae-4bac-8893-7677ac74787b-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.672214 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/dc460969-e1ae-4bac-8893-7677ac74787b-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.773879 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf7l6\" (UniqueName: \"kubernetes.io/projected/dc460969-e1ae-4bac-8893-7677ac74787b-kube-api-access-cf7l6\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.773954 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/dc460969-e1ae-4bac-8893-7677ac74787b-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.774028 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/dc460969-e1ae-4bac-8893-7677ac74787b-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.774069 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/dc460969-e1ae-4bac-8893-7677ac74787b-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.775097 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/dc460969-e1ae-4bac-8893-7677ac74787b-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.775450 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/dc460969-e1ae-4bac-8893-7677ac74787b-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.786549 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/dc460969-e1ae-4bac-8893-7677ac74787b-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.796599 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf7l6\" (UniqueName: \"kubernetes.io/projected/dc460969-e1ae-4bac-8893-7677ac74787b-kube-api-access-cf7l6\") pod \"default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4\" (UID: \"dc460969-e1ae-4bac-8893-7677ac74787b\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:46 crc kubenswrapper[4713]: I0308 00:34:46.921027 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" Mar 08 00:34:48 crc kubenswrapper[4713]: I0308 00:34:48.676079 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4"] Mar 08 00:34:48 crc kubenswrapper[4713]: W0308 00:34:48.708604 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc460969_e1ae_4bac_8893_7677ac74787b.slice/crio-9cd0fe6362f15705a632e67df51326523ce64f9f1e230408165c870d42639fac WatchSource:0}: Error finding container 9cd0fe6362f15705a632e67df51326523ce64f9f1e230408165c870d42639fac: Status 404 returned error can't find the container with id 9cd0fe6362f15705a632e67df51326523ce64f9f1e230408165c870d42639fac Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.209486 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerStarted","Data":"49fd9efa0d17e1b0a31476983cd621a2c4da29c35a74e8a29e32b9d478b98ff7"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.212341 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"58a27c65a5ae34c4e07d3676ebb3c90914314f1fcea054bbe537214aa2b27e54"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.212384 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"fe2077d4048ef8a7f48155c90a44e800aaf420d311535ee1a6f2b4538a01da6e"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.214845 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"70e8c69b8363d7dda6445bab94851d9634cebc6b36fa398befdc00186319c707"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.214879 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"4ea733096fe695d66bbcbe57f75287225c05a01b72fa0f3bd5f0165fa4a545ef"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.221593 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"b2f3a5a9db7bcda7e3be2eea7306d4663f2317fbad21fd29ba1b163bf6d167cd"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.224754 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerStarted","Data":"821c5b1c62c2dcde14a74894cdc9009068a9627d2a8c835bc11af48ec9ec9fa1"} Mar 08 00:34:49 crc kubenswrapper[4713]: I0308 00:34:49.224779 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerStarted","Data":"9cd0fe6362f15705a632e67df51326523ce64f9f1e230408165c870d42639fac"} Mar 08 00:34:50 crc kubenswrapper[4713]: I0308 00:34:50.597562 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Mar 08 00:34:50 crc kubenswrapper[4713]: I0308 00:34:50.665718 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" 
Mar 08 00:34:51 crc kubenswrapper[4713]: I0308 00:34:51.310753 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.255224 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerStarted","Data":"8a2a5280aad7b8979a719e3c7610932d4cd0af00930dbb8a758731dd94d5aa76"} Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.852653 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5"] Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.853893 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.856008 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.869161 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5"] Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.971727 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/a441502e-5d0a-4ec6-ac3c-df20f292efc8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.972110 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: 
\"kubernetes.io/secret/a441502e-5d0a-4ec6-ac3c-df20f292efc8-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.972215 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgpc7\" (UniqueName: \"kubernetes.io/projected/a441502e-5d0a-4ec6-ac3c-df20f292efc8-kube-api-access-xgpc7\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:52 crc kubenswrapper[4713]: I0308 00:34:52.972275 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a441502e-5d0a-4ec6-ac3c-df20f292efc8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.074036 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/a441502e-5d0a-4ec6-ac3c-df20f292efc8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.074079 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/a441502e-5d0a-4ec6-ac3c-df20f292efc8-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: 
\"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.074154 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgpc7\" (UniqueName: \"kubernetes.io/projected/a441502e-5d0a-4ec6-ac3c-df20f292efc8-kube-api-access-xgpc7\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.074195 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a441502e-5d0a-4ec6-ac3c-df20f292efc8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.075113 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a441502e-5d0a-4ec6-ac3c-df20f292efc8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.075125 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/a441502e-5d0a-4ec6-ac3c-df20f292efc8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.080162 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/a441502e-5d0a-4ec6-ac3c-df20f292efc8-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.097116 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgpc7\" (UniqueName: \"kubernetes.io/projected/a441502e-5d0a-4ec6-ac3c-df20f292efc8-kube-api-access-xgpc7\") pod \"default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5\" (UID: \"a441502e-5d0a-4ec6-ac3c-df20f292efc8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.176471 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.541391 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:34:53 crc kubenswrapper[4713]: E0308 00:34:53.541859 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:34:53 crc kubenswrapper[4713]: I0308 00:34:53.591070 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5"] Mar 08 00:34:55 crc kubenswrapper[4713]: I0308 00:34:55.302755 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerStarted","Data":"80f4b4014c4dbe1fb437f8e190a30553b02ac77aad1b5eb6b6bee29d7230bb50"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.309880 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerStarted","Data":"6b95db28ad5a98065c5e450da4389e996cc18574d525c6fb99d295022c4eb159"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.311680 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"12f24fd5ea75d7fac61d291610b51da255f458f9839805c4c526a99564e4c9c0"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.314910 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerStarted","Data":"7f14074c625eb16dc1fbd098d47e54dd2c4d6db611ea7361baf8ed8a511504bb"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.316758 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"76d6e5d8-8303-43ac-a477-0dfe579adad2","Type":"ContainerStarted","Data":"960529310bef2e081705c184e3296095340d6a20de195788598008091f39a7be"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.318704 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"896dda969c52affe6df70b3c89ab1673b020960693dd9ea87bff10a44743cc9c"} 
Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.320151 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"36aa5f3c129ba3c2478b4da8d8b4c638f47ce70147a4454414cb0cd35e050711"} Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.338115 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" podStartSLOduration=3.464379417 podStartE2EDuration="26.338087858s" podCreationTimestamp="2026-03-08 00:34:30 +0000 UTC" firstStartedPulling="2026-03-08 00:34:33.104934623 +0000 UTC m=+1727.224566866" lastFinishedPulling="2026-03-08 00:34:55.978643074 +0000 UTC m=+1750.098275307" observedRunningTime="2026-03-08 00:34:56.333107277 +0000 UTC m=+1750.452739520" watchObservedRunningTime="2026-03-08 00:34:56.338087858 +0000 UTC m=+1750.457720111" Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.363645 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=17.472960069 podStartE2EDuration="40.363621031s" podCreationTimestamp="2026-03-08 00:34:16 +0000 UTC" firstStartedPulling="2026-03-08 00:34:33.006505538 +0000 UTC m=+1727.126137771" lastFinishedPulling="2026-03-08 00:34:55.8971665 +0000 UTC m=+1750.016798733" observedRunningTime="2026-03-08 00:34:56.35712873 +0000 UTC m=+1750.476760993" watchObservedRunningTime="2026-03-08 00:34:56.363621031 +0000 UTC m=+1750.483253264" Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.383518 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" podStartSLOduration=3.246997386 podStartE2EDuration="10.383492065s" podCreationTimestamp="2026-03-08 00:34:46 +0000 UTC" 
firstStartedPulling="2026-03-08 00:34:48.735160513 +0000 UTC m=+1742.854792746" lastFinishedPulling="2026-03-08 00:34:55.871655192 +0000 UTC m=+1749.991287425" observedRunningTime="2026-03-08 00:34:56.379979312 +0000 UTC m=+1750.499611555" watchObservedRunningTime="2026-03-08 00:34:56.383492065 +0000 UTC m=+1750.503124298" Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.403280 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" podStartSLOduration=8.405990508 podStartE2EDuration="23.403261876s" podCreationTimestamp="2026-03-08 00:34:33 +0000 UTC" firstStartedPulling="2026-03-08 00:34:40.902287576 +0000 UTC m=+1735.021919809" lastFinishedPulling="2026-03-08 00:34:55.899558944 +0000 UTC m=+1750.019191177" observedRunningTime="2026-03-08 00:34:56.399309482 +0000 UTC m=+1750.518941735" watchObservedRunningTime="2026-03-08 00:34:56.403261876 +0000 UTC m=+1750.522894099" Mar 08 00:34:56 crc kubenswrapper[4713]: I0308 00:34:56.423248 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" podStartSLOduration=2.802087534 podStartE2EDuration="17.423222312s" podCreationTimestamp="2026-03-08 00:34:39 +0000 UTC" firstStartedPulling="2026-03-08 00:34:41.386598089 +0000 UTC m=+1735.506230332" lastFinishedPulling="2026-03-08 00:34:56.007732877 +0000 UTC m=+1750.127365110" observedRunningTime="2026-03-08 00:34:56.415214501 +0000 UTC m=+1750.534846724" watchObservedRunningTime="2026-03-08 00:34:56.423222312 +0000 UTC m=+1750.542854575" Mar 08 00:34:57 crc kubenswrapper[4713]: I0308 00:34:57.329190 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerStarted","Data":"b30b3e644e5c27e518037db47b829436643bf0d2f743aeb7246cb9b7080e84f9"} 
Mar 08 00:34:57 crc kubenswrapper[4713]: I0308 00:34:57.352983 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" podStartSLOduration=4.819909506 podStartE2EDuration="5.352967192s" podCreationTimestamp="2026-03-08 00:34:52 +0000 UTC" firstStartedPulling="2026-03-08 00:34:55.797343059 +0000 UTC m=+1749.916975292" lastFinishedPulling="2026-03-08 00:34:56.330400745 +0000 UTC m=+1750.450032978" observedRunningTime="2026-03-08 00:34:57.34794109 +0000 UTC m=+1751.467573333" watchObservedRunningTime="2026-03-08 00:34:57.352967192 +0000 UTC m=+1751.472599425" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.089853 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.090388 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" podUID="52ed2487-d016-4930-a9ec-98500bfc0db3" containerName="default-interconnect" containerID="cri-o://2ee28c2f8fc1433f9ba19f9c07ab8d85929756f524dd0e86526ecf528ab6aea3" gracePeriod=30 Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.357281 4713 generic.go:334] "Generic (PLEG): container finished" podID="52ed2487-d016-4930-a9ec-98500bfc0db3" containerID="2ee28c2f8fc1433f9ba19f9c07ab8d85929756f524dd0e86526ecf528ab6aea3" exitCode=0 Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.357577 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" event={"ID":"52ed2487-d016-4930-a9ec-98500bfc0db3","Type":"ContainerDied","Data":"2ee28c2f8fc1433f9ba19f9c07ab8d85929756f524dd0e86526ecf528ab6aea3"} Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.367633 4713 generic.go:334] "Generic (PLEG): container finished" podID="a441502e-5d0a-4ec6-ac3c-df20f292efc8" 
containerID="6b95db28ad5a98065c5e450da4389e996cc18574d525c6fb99d295022c4eb159" exitCode=0 Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.367675 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerDied","Data":"6b95db28ad5a98065c5e450da4389e996cc18574d525c6fb99d295022c4eb159"} Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.368164 4713 scope.go:117] "RemoveContainer" containerID="6b95db28ad5a98065c5e450da4389e996cc18574d525c6fb99d295022c4eb159" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.521110 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604110 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604159 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604210 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc 
kubenswrapper[4713]: I0308 00:35:00.604251 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604291 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkmlb\" (UniqueName: \"kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604360 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.604381 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca\") pod \"52ed2487-d016-4930-a9ec-98500bfc0db3\" (UID: \"52ed2487-d016-4930-a9ec-98500bfc0db3\") " Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.606294 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "sasl-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.613578 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.614341 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.616685 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.618007 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb" (OuterVolumeSpecName: "kube-api-access-rkmlb") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "kube-api-access-rkmlb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.624146 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.626154 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "52ed2487-d016-4930-a9ec-98500bfc0db3" (UID: "52ed2487-d016-4930-a9ec-98500bfc0db3"). InnerVolumeSpecName "default-interconnect-openstack-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705874 4713 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705903 4713 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705913 4713 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/52ed2487-d016-4930-a9ec-98500bfc0db3-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705922 4713 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705933 4713 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705943 4713 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/52ed2487-d016-4930-a9ec-98500bfc0db3-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:00 crc kubenswrapper[4713]: I0308 00:35:00.705953 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkmlb\" (UniqueName: 
\"kubernetes.io/projected/52ed2487-d016-4930-a9ec-98500bfc0db3-kube-api-access-rkmlb\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.154862 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-qpwg6"] Mar 08 00:35:01 crc kubenswrapper[4713]: E0308 00:35:01.156103 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ed2487-d016-4930-a9ec-98500bfc0db3" containerName="default-interconnect" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.156203 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ed2487-d016-4930-a9ec-98500bfc0db3" containerName="default-interconnect" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.156400 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="52ed2487-d016-4930-a9ec-98500bfc0db3" containerName="default-interconnect" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.157039 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.161748 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-qpwg6"] Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.212787 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.212852 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw56d\" (UniqueName: \"kubernetes.io/projected/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-kube-api-access-tw56d\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.212896 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.212929 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-ca\") pod 
\"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.212949 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-users\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.213052 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.213131 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-config\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314596 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc 
kubenswrapper[4713]: I0308 00:35:01.314655 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-config\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314716 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314738 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw56d\" (UniqueName: \"kubernetes.io/projected/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-kube-api-access-tw56d\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314759 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314786 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: 
\"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.314805 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-users\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.316214 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-config\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.319535 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.320522 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc 
kubenswrapper[4713]: I0308 00:35:01.320662 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.320602 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.339046 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-sasl-users\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.344669 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw56d\" (UniqueName: \"kubernetes.io/projected/a45b0eb2-8f38-42e0-8c0a-98a6f453263a-kube-api-access-tw56d\") pod \"default-interconnect-68864d46cb-qpwg6\" (UID: \"a45b0eb2-8f38-42e0-8c0a-98a6f453263a\") " pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.377362 4713 generic.go:334] "Generic (PLEG): container finished" podID="fff80c8a-de9a-483b-8be3-5ce1423649cb" containerID="58a27c65a5ae34c4e07d3676ebb3c90914314f1fcea054bbe537214aa2b27e54" exitCode=0 Mar 08 00:35:01 crc 
kubenswrapper[4713]: I0308 00:35:01.377663 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerDied","Data":"58a27c65a5ae34c4e07d3676ebb3c90914314f1fcea054bbe537214aa2b27e54"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.378478 4713 scope.go:117] "RemoveContainer" containerID="58a27c65a5ae34c4e07d3676ebb3c90914314f1fcea054bbe537214aa2b27e54" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.381400 4713 generic.go:334] "Generic (PLEG): container finished" podID="367439a6-a382-49f1-b0af-cf399b5a6401" containerID="70e8c69b8363d7dda6445bab94851d9634cebc6b36fa398befdc00186319c707" exitCode=0 Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.381476 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerDied","Data":"70e8c69b8363d7dda6445bab94851d9634cebc6b36fa398befdc00186319c707"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.382994 4713 scope.go:117] "RemoveContainer" containerID="70e8c69b8363d7dda6445bab94851d9634cebc6b36fa398befdc00186319c707" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.388630 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerStarted","Data":"578dd7fe1589e58e1d385d60d8db2edd769342686802c8b4da7a5cf54a0120fd"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.391446 4713 generic.go:334] "Generic (PLEG): container finished" podID="7aaf11cd-f1cf-42c7-9fe9-52880e0af19c" containerID="b2f3a5a9db7bcda7e3be2eea7306d4663f2317fbad21fd29ba1b163bf6d167cd" exitCode=0 Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.391516 4713 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerDied","Data":"b2f3a5a9db7bcda7e3be2eea7306d4663f2317fbad21fd29ba1b163bf6d167cd"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.391920 4713 scope.go:117] "RemoveContainer" containerID="b2f3a5a9db7bcda7e3be2eea7306d4663f2317fbad21fd29ba1b163bf6d167cd" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.394929 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" event={"ID":"52ed2487-d016-4930-a9ec-98500bfc0db3","Type":"ContainerDied","Data":"71917d86375943e31a9292ae7412991594bcc498f11ed7d30ee0bdc265d89c06"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.395051 4713 scope.go:117] "RemoveContainer" containerID="2ee28c2f8fc1433f9ba19f9c07ab8d85929756f524dd0e86526ecf528ab6aea3" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.394957 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-t7lzv" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.403024 4713 generic.go:334] "Generic (PLEG): container finished" podID="dc460969-e1ae-4bac-8893-7677ac74787b" containerID="821c5b1c62c2dcde14a74894cdc9009068a9627d2a8c835bc11af48ec9ec9fa1" exitCode=0 Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.403077 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerDied","Data":"821c5b1c62c2dcde14a74894cdc9009068a9627d2a8c835bc11af48ec9ec9fa1"} Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.403596 4713 scope.go:117] "RemoveContainer" containerID="821c5b1c62c2dcde14a74894cdc9009068a9627d2a8c835bc11af48ec9ec9fa1" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.472569 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.552426 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:35:01 crc kubenswrapper[4713]: I0308 00:35:01.561262 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-t7lzv"] Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.022513 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-qpwg6"] Mar 08 00:35:02 crc kubenswrapper[4713]: W0308 00:35:02.025406 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda45b0eb2_8f38_42e0_8c0a_98a6f453263a.slice/crio-7b32fe186e015d884280c892d44054af0c19013053cb8cb20930cf37119a0a81 WatchSource:0}: Error finding container 
7b32fe186e015d884280c892d44054af0c19013053cb8cb20930cf37119a0a81: Status 404 returned error can't find the container with id 7b32fe186e015d884280c892d44054af0c19013053cb8cb20930cf37119a0a81 Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.413507 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"e8e35cf9fa960b38dca238f7c8b96ed5b552a38770c5bb83f929694a8f1480f1"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.415603 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" event={"ID":"a45b0eb2-8f38-42e0-8c0a-98a6f453263a","Type":"ContainerStarted","Data":"aa0612733b05d7027a684d1a0b180dae0fe589898783b419564190e9bedaa400"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.415723 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" event={"ID":"a45b0eb2-8f38-42e0-8c0a-98a6f453263a","Type":"ContainerStarted","Data":"7b32fe186e015d884280c892d44054af0c19013053cb8cb20930cf37119a0a81"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.418730 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"bc221c0389f357e012f607860356b103e99a5311c97bbd49bf5c2b82612f9fba"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.422010 4713 generic.go:334] "Generic (PLEG): container finished" podID="a441502e-5d0a-4ec6-ac3c-df20f292efc8" containerID="578dd7fe1589e58e1d385d60d8db2edd769342686802c8b4da7a5cf54a0120fd" exitCode=0 Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.422132 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerDied","Data":"578dd7fe1589e58e1d385d60d8db2edd769342686802c8b4da7a5cf54a0120fd"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.422290 4713 scope.go:117] "RemoveContainer" containerID="6b95db28ad5a98065c5e450da4389e996cc18574d525c6fb99d295022c4eb159" Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.422510 4713 scope.go:117] "RemoveContainer" containerID="578dd7fe1589e58e1d385d60d8db2edd769342686802c8b4da7a5cf54a0120fd" Mar 08 00:35:02 crc kubenswrapper[4713]: E0308 00:35:02.422685 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5_service-telemetry(a441502e-5d0a-4ec6-ac3c-df20f292efc8)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" podUID="a441502e-5d0a-4ec6-ac3c-df20f292efc8" Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.424789 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"f22d15b01ac342c9a988dd24cb96db243b978fd6684d586ecc3d821e60a23c8a"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.442425 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerStarted","Data":"4e8228dd7e1505ae76be6137d6aa04f351c5c66796994d3f4fccc27926d99363"} Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.559232 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52ed2487-d016-4930-a9ec-98500bfc0db3" 
path="/var/lib/kubelet/pods/52ed2487-d016-4930-a9ec-98500bfc0db3/volumes" Mar 08 00:35:02 crc kubenswrapper[4713]: I0308 00:35:02.573020 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-qpwg6" podStartSLOduration=2.57300065 podStartE2EDuration="2.57300065s" podCreationTimestamp="2026-03-08 00:35:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:35:02.535351387 +0000 UTC m=+1756.654983620" watchObservedRunningTime="2026-03-08 00:35:02.57300065 +0000 UTC m=+1756.692632883" Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.451597 4713 generic.go:334] "Generic (PLEG): container finished" podID="7aaf11cd-f1cf-42c7-9fe9-52880e0af19c" containerID="f22d15b01ac342c9a988dd24cb96db243b978fd6684d586ecc3d821e60a23c8a" exitCode=0 Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.451643 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerDied","Data":"f22d15b01ac342c9a988dd24cb96db243b978fd6684d586ecc3d821e60a23c8a"} Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.451922 4713 scope.go:117] "RemoveContainer" containerID="b2f3a5a9db7bcda7e3be2eea7306d4663f2317fbad21fd29ba1b163bf6d167cd" Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.452484 4713 scope.go:117] "RemoveContainer" containerID="f22d15b01ac342c9a988dd24cb96db243b978fd6684d586ecc3d821e60a23c8a" Mar 08 00:35:03 crc kubenswrapper[4713]: E0308 00:35:03.452838 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq_service-telemetry(7aaf11cd-f1cf-42c7-9fe9-52880e0af19c)\"" 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" podUID="7aaf11cd-f1cf-42c7-9fe9-52880e0af19c" Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.454440 4713 generic.go:334] "Generic (PLEG): container finished" podID="dc460969-e1ae-4bac-8893-7677ac74787b" containerID="4e8228dd7e1505ae76be6137d6aa04f351c5c66796994d3f4fccc27926d99363" exitCode=0 Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.454481 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerDied","Data":"4e8228dd7e1505ae76be6137d6aa04f351c5c66796994d3f4fccc27926d99363"} Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.454748 4713 scope.go:117] "RemoveContainer" containerID="4e8228dd7e1505ae76be6137d6aa04f351c5c66796994d3f4fccc27926d99363" Mar 08 00:35:03 crc kubenswrapper[4713]: E0308 00:35:03.454930 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4_service-telemetry(dc460969-e1ae-4bac-8893-7677ac74787b)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" podUID="dc460969-e1ae-4bac-8893-7677ac74787b" Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.457302 4713 generic.go:334] "Generic (PLEG): container finished" podID="fff80c8a-de9a-483b-8be3-5ce1423649cb" containerID="e8e35cf9fa960b38dca238f7c8b96ed5b552a38770c5bb83f929694a8f1480f1" exitCode=0 Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.457369 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerDied","Data":"e8e35cf9fa960b38dca238f7c8b96ed5b552a38770c5bb83f929694a8f1480f1"} Mar 
08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.458245 4713 scope.go:117] "RemoveContainer" containerID="e8e35cf9fa960b38dca238f7c8b96ed5b552a38770c5bb83f929694a8f1480f1" Mar 08 00:35:03 crc kubenswrapper[4713]: E0308 00:35:03.459159 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl_service-telemetry(fff80c8a-de9a-483b-8be3-5ce1423649cb)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" podUID="fff80c8a-de9a-483b-8be3-5ce1423649cb" Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.459696 4713 generic.go:334] "Generic (PLEG): container finished" podID="367439a6-a382-49f1-b0af-cf399b5a6401" containerID="bc221c0389f357e012f607860356b103e99a5311c97bbd49bf5c2b82612f9fba" exitCode=0 Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.459725 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerDied","Data":"bc221c0389f357e012f607860356b103e99a5311c97bbd49bf5c2b82612f9fba"} Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.460230 4713 scope.go:117] "RemoveContainer" containerID="bc221c0389f357e012f607860356b103e99a5311c97bbd49bf5c2b82612f9fba" Mar 08 00:35:03 crc kubenswrapper[4713]: E0308 00:35:03.460478 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg_service-telemetry(367439a6-a382-49f1-b0af-cf399b5a6401)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" podUID="367439a6-a382-49f1-b0af-cf399b5a6401" Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.494584 4713 
scope.go:117] "RemoveContainer" containerID="821c5b1c62c2dcde14a74894cdc9009068a9627d2a8c835bc11af48ec9ec9fa1" Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.536945 4713 scope.go:117] "RemoveContainer" containerID="58a27c65a5ae34c4e07d3676ebb3c90914314f1fcea054bbe537214aa2b27e54" Mar 08 00:35:03 crc kubenswrapper[4713]: I0308 00:35:03.591104 4713 scope.go:117] "RemoveContainer" containerID="70e8c69b8363d7dda6445bab94851d9634cebc6b36fa398befdc00186319c707" Mar 08 00:35:04 crc kubenswrapper[4713]: I0308 00:35:04.541997 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:35:04 crc kubenswrapper[4713]: E0308 00:35:04.542230 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.145855 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.147384 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.149273 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.149663 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.157249 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.235591 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/1a97222f-e496-4378-bc3d-6a508f559df7-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.235713 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djj99\" (UniqueName: \"kubernetes.io/projected/1a97222f-e496-4378-bc3d-6a508f559df7-kube-api-access-djj99\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.235777 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/1a97222f-e496-4378-bc3d-6a508f559df7-qdr-test-config\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.336670 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: 
\"kubernetes.io/secret/1a97222f-e496-4378-bc3d-6a508f559df7-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.336767 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djj99\" (UniqueName: \"kubernetes.io/projected/1a97222f-e496-4378-bc3d-6a508f559df7-kube-api-access-djj99\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.336807 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/1a97222f-e496-4378-bc3d-6a508f559df7-qdr-test-config\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.337509 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/1a97222f-e496-4378-bc3d-6a508f559df7-qdr-test-config\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.349539 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/1a97222f-e496-4378-bc3d-6a508f559df7-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") " pod="service-telemetry/qdr-test" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.353031 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djj99\" (UniqueName: \"kubernetes.io/projected/1a97222f-e496-4378-bc3d-6a508f559df7-kube-api-access-djj99\") pod \"qdr-test\" (UID: \"1a97222f-e496-4378-bc3d-6a508f559df7\") 
" pod="service-telemetry/qdr-test" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.501430 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 08 00:35:09 crc kubenswrapper[4713]: I0308 00:35:09.930609 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 08 00:35:09 crc kubenswrapper[4713]: W0308 00:35:09.933134 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a97222f_e496_4378_bc3d_6a508f559df7.slice/crio-377b51021a15378245afe1499568ff5fb06b904615911739881ab5364cd49aad WatchSource:0}: Error finding container 377b51021a15378245afe1499568ff5fb06b904615911739881ab5364cd49aad: Status 404 returned error can't find the container with id 377b51021a15378245afe1499568ff5fb06b904615911739881ab5364cd49aad Mar 08 00:35:10 crc kubenswrapper[4713]: I0308 00:35:10.538808 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"1a97222f-e496-4378-bc3d-6a508f559df7","Type":"ContainerStarted","Data":"377b51021a15378245afe1499568ff5fb06b904615911739881ab5364cd49aad"} Mar 08 00:35:14 crc kubenswrapper[4713]: I0308 00:35:14.541549 4713 scope.go:117] "RemoveContainer" containerID="bc221c0389f357e012f607860356b103e99a5311c97bbd49bf5c2b82612f9fba" Mar 08 00:35:14 crc kubenswrapper[4713]: I0308 00:35:14.542212 4713 scope.go:117] "RemoveContainer" containerID="f22d15b01ac342c9a988dd24cb96db243b978fd6684d586ecc3d821e60a23c8a" Mar 08 00:35:15 crc kubenswrapper[4713]: I0308 00:35:15.540650 4713 scope.go:117] "RemoveContainer" containerID="4e8228dd7e1505ae76be6137d6aa04f351c5c66796994d3f4fccc27926d99363" Mar 08 00:35:15 crc kubenswrapper[4713]: I0308 00:35:15.541873 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:35:15 crc kubenswrapper[4713]: E0308 00:35:15.542124 4713 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:35:15 crc kubenswrapper[4713]: I0308 00:35:15.543079 4713 scope.go:117] "RemoveContainer" containerID="e8e35cf9fa960b38dca238f7c8b96ed5b552a38770c5bb83f929694a8f1480f1" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.540635 4713 scope.go:117] "RemoveContainer" containerID="578dd7fe1589e58e1d385d60d8db2edd769342686802c8b4da7a5cf54a0120fd" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.601656 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg" event={"ID":"367439a6-a382-49f1-b0af-cf399b5a6401","Type":"ContainerStarted","Data":"3f26b3b7d0aaecb8ab2d98362882634e444eeec6faa861014c5228723d9f98df"} Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.603963 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq" event={"ID":"7aaf11cd-f1cf-42c7-9fe9-52880e0af19c","Type":"ContainerStarted","Data":"7ca4705ce545532bdbd07579c75173fdf9ca0a5b0116e20b9720c2874be3cbe3"} Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.606292 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4" event={"ID":"dc460969-e1ae-4bac-8893-7677ac74787b","Type":"ContainerStarted","Data":"8f1f71ea5ff75a5db89e966feb3b61b330a9abdfe33b187f6554a8fad390f789"} Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.607738 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" 
event={"ID":"1a97222f-e496-4378-bc3d-6a508f559df7","Type":"ContainerStarted","Data":"5b96f2c1fffdca7302dfd1a0a9361137e4e1bec2600143e51fa5ed758a317b03"} Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.609670 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl" event={"ID":"fff80c8a-de9a-483b-8be3-5ce1423649cb","Type":"ContainerStarted","Data":"56cfac6e554d14a6ac665137ad59c976d1246f7973cb11f290ac3c5b0d556219"} Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.678279 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=1.556626901 podStartE2EDuration="8.678256207s" podCreationTimestamp="2026-03-08 00:35:09 +0000 UTC" firstStartedPulling="2026-03-08 00:35:09.934691815 +0000 UTC m=+1764.054324048" lastFinishedPulling="2026-03-08 00:35:17.056321121 +0000 UTC m=+1771.175953354" observedRunningTime="2026-03-08 00:35:17.666079505 +0000 UTC m=+1771.785711738" watchObservedRunningTime="2026-03-08 00:35:17.678256207 +0000 UTC m=+1771.797888440" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.916783 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-f4r52"] Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.918277 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.920367 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.920703 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.921151 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.921878 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.922731 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.922754 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.931432 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-f4r52"] Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975070 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975131 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" 
(UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975164 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975179 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975207 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqhnv\" (UniqueName: \"kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 00:35:17.975235 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:17 crc kubenswrapper[4713]: I0308 
00:35:17.975251 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.075880 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.075925 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.075957 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.075976 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " 
pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.076000 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqhnv\" (UniqueName: \"kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.076031 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.076047 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.077699 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.077972 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: 
\"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.078041 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.078851 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.079332 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.079713 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher\") pod \"stf-smoketest-smoke1-f4r52\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.099122 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqhnv\" (UniqueName: \"kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv\") pod \"stf-smoketest-smoke1-f4r52\" (UID: 
\"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.236552 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.258592 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.259639 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.279038 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxl98\" (UniqueName: \"kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98\") pod \"curl\" (UID: \"bf866ca9-19cc-4b26-96ae-370b911a5776\") " pod="service-telemetry/curl" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.287170 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.385864 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxl98\" (UniqueName: \"kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98\") pod \"curl\" (UID: \"bf866ca9-19cc-4b26-96ae-370b911a5776\") " pod="service-telemetry/curl" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.421615 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxl98\" (UniqueName: \"kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98\") pod \"curl\" (UID: \"bf866ca9-19cc-4b26-96ae-370b911a5776\") " pod="service-telemetry/curl" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.491911 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/stf-smoketest-smoke1-f4r52"] Mar 08 00:35:18 crc kubenswrapper[4713]: W0308 00:35:18.500522 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cbd55a3_d3b0_4c65_8b16_a7a9e2a8c033.slice/crio-2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d WatchSource:0}: Error finding container 2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d: Status 404 returned error can't find the container with id 2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.618062 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5" event={"ID":"a441502e-5d0a-4ec6-ac3c-df20f292efc8","Type":"ContainerStarted","Data":"ba441f81d307aa312c0fb8a5c4a6e74842c602a014943d0f4bb593e72e761a72"} Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.619202 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerStarted","Data":"2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d"} Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.625135 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 08 00:35:18 crc kubenswrapper[4713]: I0308 00:35:18.845832 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 08 00:35:18 crc kubenswrapper[4713]: W0308 00:35:18.847437 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf866ca9_19cc_4b26_96ae_370b911a5776.slice/crio-7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc WatchSource:0}: Error finding container 7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc: Status 404 returned error can't find the container with id 7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc Mar 08 00:35:19 crc kubenswrapper[4713]: I0308 00:35:19.628935 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"bf866ca9-19cc-4b26-96ae-370b911a5776","Type":"ContainerStarted","Data":"7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc"} Mar 08 00:35:22 crc kubenswrapper[4713]: I0308 00:35:22.761719 4713 scope.go:117] "RemoveContainer" containerID="01d9b7b88d08637099f2699ad9a25e90c9327b764008cf2cde4f1f7e06061451" Mar 08 00:35:27 crc kubenswrapper[4713]: I0308 00:35:27.541098 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:35:27 crc kubenswrapper[4713]: E0308 00:35:27.541712 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:35:31 crc kubenswrapper[4713]: I0308 00:35:31.458704 4713 scope.go:117] "RemoveContainer" 
containerID="0fd1776a90badc7eb6f79de68dfeed110b30a49d06c2f0b0856f0e37b49744ef" Mar 08 00:35:33 crc kubenswrapper[4713]: I0308 00:35:33.733936 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf866ca9-19cc-4b26-96ae-370b911a5776" containerID="63fa6058ccb9e9e70ae07366fd458652f6fd38278fd206e536070f9b9ac066d9" exitCode=0 Mar 08 00:35:33 crc kubenswrapper[4713]: I0308 00:35:33.733992 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"bf866ca9-19cc-4b26-96ae-370b911a5776","Type":"ContainerDied","Data":"63fa6058ccb9e9e70ae07366fd458652f6fd38278fd206e536070f9b9ac066d9"} Mar 08 00:35:33 crc kubenswrapper[4713]: I0308 00:35:33.735734 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerStarted","Data":"aa007f126728fe60cfc4386205caadd75377d0ec9c951c570e3220deeb63fde8"} Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.270087 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.407335 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_bf866ca9-19cc-4b26-96ae-370b911a5776/curl/0.log" Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.407771 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxl98\" (UniqueName: \"kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98\") pod \"bf866ca9-19cc-4b26-96ae-370b911a5776\" (UID: \"bf866ca9-19cc-4b26-96ae-370b911a5776\") " Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.414191 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98" (OuterVolumeSpecName: "kube-api-access-gxl98") pod "bf866ca9-19cc-4b26-96ae-370b911a5776" (UID: "bf866ca9-19cc-4b26-96ae-370b911a5776"). InnerVolumeSpecName "kube-api-access-gxl98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.510421 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxl98\" (UniqueName: \"kubernetes.io/projected/bf866ca9-19cc-4b26-96ae-370b911a5776-kube-api-access-gxl98\") on node \"crc\" DevicePath \"\"" Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.644107 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lfj62_6bdaeb5b-32b1-4454-9a68-0893de41cc75/prometheus-webhook-snmp/0.log" Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.771893 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerStarted","Data":"817ec7435e70836cc77b4b156e719fe269db6da434be0baf352b225d4eaf98d6"} Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.773332 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.773841 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"bf866ca9-19cc-4b26-96ae-370b911a5776","Type":"ContainerDied","Data":"7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc"} Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.773862 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e5349214c265556ea82d7cb5b315709df49c2e88391b8ab89d9c9bc858b9ccc" Mar 08 00:35:38 crc kubenswrapper[4713]: I0308 00:35:38.789763 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-f4r52" podStartSLOduration=1.675689631 podStartE2EDuration="21.789747418s" podCreationTimestamp="2026-03-08 00:35:17 +0000 UTC" firstStartedPulling="2026-03-08 00:35:18.502928596 +0000 UTC m=+1772.622560829" lastFinishedPulling="2026-03-08 00:35:38.616986383 +0000 UTC m=+1792.736618616" observedRunningTime="2026-03-08 00:35:38.788171736 +0000 UTC m=+1792.907803969" watchObservedRunningTime="2026-03-08 00:35:38.789747418 +0000 UTC m=+1792.909379641" Mar 08 00:35:39 crc kubenswrapper[4713]: I0308 00:35:39.542352 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:35:39 crc kubenswrapper[4713]: E0308 00:35:39.542962 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:35:52 crc kubenswrapper[4713]: I0308 00:35:52.541302 4713 scope.go:117] "RemoveContainer" 
containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:35:52 crc kubenswrapper[4713]: E0308 00:35:52.542007 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.135681 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548836-wg7kn"] Mar 08 00:36:00 crc kubenswrapper[4713]: E0308 00:36:00.136524 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf866ca9-19cc-4b26-96ae-370b911a5776" containerName="curl" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.136536 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf866ca9-19cc-4b26-96ae-370b911a5776" containerName="curl" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.136668 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf866ca9-19cc-4b26-96ae-370b911a5776" containerName="curl" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.137125 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.139576 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.139588 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.139886 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.145640 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548836-wg7kn"] Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.218129 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wrp6\" (UniqueName: \"kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6\") pod \"auto-csr-approver-29548836-wg7kn\" (UID: \"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c\") " pod="openshift-infra/auto-csr-approver-29548836-wg7kn" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.319302 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wrp6\" (UniqueName: \"kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6\") pod \"auto-csr-approver-29548836-wg7kn\" (UID: \"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c\") " pod="openshift-infra/auto-csr-approver-29548836-wg7kn" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.337753 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wrp6\" (UniqueName: \"kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6\") pod \"auto-csr-approver-29548836-wg7kn\" (UID: \"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c\") " 
pod="openshift-infra/auto-csr-approver-29548836-wg7kn" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.462939 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.693960 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548836-wg7kn"] Mar 08 00:36:00 crc kubenswrapper[4713]: I0308 00:36:00.936882 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" event={"ID":"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c","Type":"ContainerStarted","Data":"d64b7ce18222926abe2f8743d437dc7c186b8b27d33d8d52c3a94e8de8c80271"} Mar 08 00:36:02 crc kubenswrapper[4713]: I0308 00:36:02.952040 4713 generic.go:334] "Generic (PLEG): container finished" podID="90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" containerID="d6d99e02f6a45a057a86ce43be270637fd870f48d563905dc65b832b4165b2d6" exitCode=0 Mar 08 00:36:02 crc kubenswrapper[4713]: I0308 00:36:02.952144 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" event={"ID":"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c","Type":"ContainerDied","Data":"d6d99e02f6a45a057a86ce43be270637fd870f48d563905dc65b832b4165b2d6"} Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.237723 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.290295 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wrp6\" (UniqueName: \"kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6\") pod \"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c\" (UID: \"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c\") " Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.299721 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6" (OuterVolumeSpecName: "kube-api-access-5wrp6") pod "90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" (UID: "90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c"). InnerVolumeSpecName "kube-api-access-5wrp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.392253 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wrp6\" (UniqueName: \"kubernetes.io/projected/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c-kube-api-access-5wrp6\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.971262 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" event={"ID":"90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c","Type":"ContainerDied","Data":"d64b7ce18222926abe2f8743d437dc7c186b8b27d33d8d52c3a94e8de8c80271"} Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.971309 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d64b7ce18222926abe2f8743d437dc7c186b8b27d33d8d52c3a94e8de8c80271" Mar 08 00:36:04 crc kubenswrapper[4713]: I0308 00:36:04.971770 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548836-wg7kn" Mar 08 00:36:05 crc kubenswrapper[4713]: I0308 00:36:05.302888 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548830-csc8c"] Mar 08 00:36:05 crc kubenswrapper[4713]: I0308 00:36:05.308556 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548830-csc8c"] Mar 08 00:36:06 crc kubenswrapper[4713]: I0308 00:36:06.546845 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:36:06 crc kubenswrapper[4713]: E0308 00:36:06.547314 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:36:06 crc kubenswrapper[4713]: I0308 00:36:06.555026 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b849b06-281c-44be-a061-ca5b3905b3e1" path="/var/lib/kubelet/pods/2b849b06-281c-44be-a061-ca5b3905b3e1/volumes" Mar 08 00:36:06 crc kubenswrapper[4713]: I0308 00:36:06.985037 4713 generic.go:334] "Generic (PLEG): container finished" podID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerID="aa007f126728fe60cfc4386205caadd75377d0ec9c951c570e3220deeb63fde8" exitCode=1 Mar 08 00:36:06 crc kubenswrapper[4713]: I0308 00:36:06.985092 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerDied","Data":"aa007f126728fe60cfc4386205caadd75377d0ec9c951c570e3220deeb63fde8"} Mar 08 00:36:06 crc kubenswrapper[4713]: I0308 00:36:06.985703 4713 
scope.go:117] "RemoveContainer" containerID="aa007f126728fe60cfc4386205caadd75377d0ec9c951c570e3220deeb63fde8" Mar 08 00:36:08 crc kubenswrapper[4713]: I0308 00:36:08.759196 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lfj62_6bdaeb5b-32b1-4454-9a68-0893de41cc75/prometheus-webhook-snmp/0.log" Mar 08 00:36:11 crc kubenswrapper[4713]: I0308 00:36:11.017142 4713 generic.go:334] "Generic (PLEG): container finished" podID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerID="817ec7435e70836cc77b4b156e719fe269db6da434be0baf352b225d4eaf98d6" exitCode=0 Mar 08 00:36:11 crc kubenswrapper[4713]: I0308 00:36:11.017191 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerDied","Data":"817ec7435e70836cc77b4b156e719fe269db6da434be0baf352b225d4eaf98d6"} Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.361243 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408120 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408225 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408286 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408353 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408425 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqhnv\" (UniqueName: \"kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408505 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.408559 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher\") pod \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\" (UID: \"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033\") " Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.417145 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv" (OuterVolumeSpecName: "kube-api-access-mqhnv") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "kube-api-access-mqhnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.426947 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.428206 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "sensubility-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.431810 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.433085 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.444723 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.445771 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" (UID: "7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033"). InnerVolumeSpecName "ceilometer-publisher". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510549 4713 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510589 4713 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510600 4713 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510611 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510620 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqhnv\" (UniqueName: \"kubernetes.io/projected/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-kube-api-access-mqhnv\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510628 4713 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-collectd-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:12 crc kubenswrapper[4713]: I0308 00:36:12.510636 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033-ceilometer-publisher\") on node 
\"crc\" DevicePath \"\"" Mar 08 00:36:13 crc kubenswrapper[4713]: I0308 00:36:13.038846 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-f4r52" event={"ID":"7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033","Type":"ContainerDied","Data":"2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d"} Mar 08 00:36:13 crc kubenswrapper[4713]: I0308 00:36:13.039184 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2274211d96b4a09a6384930c81dd8eccdd8ac1f49e3bd4a3703759b4adf3ef1d" Mar 08 00:36:13 crc kubenswrapper[4713]: I0308 00:36:13.038897 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-f4r52" Mar 08 00:36:17 crc kubenswrapper[4713]: I0308 00:36:17.540670 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:36:17 crc kubenswrapper[4713]: E0308 00:36:17.542120 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.033367 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mljxj"] Mar 08 00:36:20 crc kubenswrapper[4713]: E0308 00:36:20.035137 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" containerName="oc" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.035301 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" containerName="oc" Mar 08 00:36:20 crc 
kubenswrapper[4713]: E0308 00:36:20.035450 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-collectd" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.035519 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-collectd" Mar 08 00:36:20 crc kubenswrapper[4713]: E0308 00:36:20.035594 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-ceilometer" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.035658 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-ceilometer" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.035891 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-collectd" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.035980 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" containerName="oc" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.036057 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033" containerName="smoketest-ceilometer" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.036993 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.039848 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.040890 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.041046 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.041164 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.041924 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mljxj"] Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.042287 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.049387 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122189 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122518 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122654 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122734 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122815 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.122927 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: 
I0308 00:36:20.123035 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224407 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224477 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224505 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224522 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " 
pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224581 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224602 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.224627 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.225553 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.225570 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: 
\"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.225879 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.225892 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.226002 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.226074 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mljxj\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.266992 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4\") pod \"stf-smoketest-smoke1-mljxj\" (UID: 
\"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.365996 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:20 crc kubenswrapper[4713]: I0308 00:36:20.867223 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mljxj"] Mar 08 00:36:21 crc kubenswrapper[4713]: I0308 00:36:21.105765 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerStarted","Data":"08db7e7c9c20f3e009a6071feb78142a592f0dfd53b676e63efbb7af03c2e14e"} Mar 08 00:36:21 crc kubenswrapper[4713]: I0308 00:36:21.106209 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerStarted","Data":"d03c5bbc7c3a11f8d2df67f7991c4c8325f0d5644a757298cca3f8f3922e00ec"} Mar 08 00:36:22 crc kubenswrapper[4713]: I0308 00:36:22.114357 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerStarted","Data":"c0d736c7faa8e82f5582112e3ecb064a9dd96bdd593e760f19d2d40ad9c9415d"} Mar 08 00:36:22 crc kubenswrapper[4713]: I0308 00:36:22.134973 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-mljxj" podStartSLOduration=2.134954552 podStartE2EDuration="2.134954552s" podCreationTimestamp="2026-03-08 00:36:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 00:36:22.133123683 +0000 UTC m=+1836.252755916" watchObservedRunningTime="2026-03-08 00:36:22.134954552 +0000 UTC m=+1836.254586805" Mar 
08 00:36:31 crc kubenswrapper[4713]: I0308 00:36:31.541606 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:36:31 crc kubenswrapper[4713]: E0308 00:36:31.542408 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:36:33 crc kubenswrapper[4713]: I0308 00:36:33.110226 4713 scope.go:117] "RemoveContainer" containerID="5ffd3bb6cf22ba954a7e67226be2ca668fd3bb44939915e41b40c3c5cd452879" Mar 08 00:36:46 crc kubenswrapper[4713]: I0308 00:36:46.546926 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:36:46 crc kubenswrapper[4713]: E0308 00:36:46.548134 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:36:53 crc kubenswrapper[4713]: I0308 00:36:53.364383 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerID="c0d736c7faa8e82f5582112e3ecb064a9dd96bdd593e760f19d2d40ad9c9415d" exitCode=0 Mar 08 00:36:53 crc kubenswrapper[4713]: I0308 00:36:53.364483 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" 
event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerDied","Data":"c0d736c7faa8e82f5582112e3ecb064a9dd96bdd593e760f19d2d40ad9c9415d"} Mar 08 00:36:53 crc kubenswrapper[4713]: I0308 00:36:53.366052 4713 scope.go:117] "RemoveContainer" containerID="c0d736c7faa8e82f5582112e3ecb064a9dd96bdd593e760f19d2d40ad9c9415d" Mar 08 00:36:55 crc kubenswrapper[4713]: I0308 00:36:55.383479 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerID="08db7e7c9c20f3e009a6071feb78142a592f0dfd53b676e63efbb7af03c2e14e" exitCode=0 Mar 08 00:36:55 crc kubenswrapper[4713]: I0308 00:36:55.383579 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerDied","Data":"08db7e7c9c20f3e009a6071feb78142a592f0dfd53b676e63efbb7af03c2e14e"} Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.640739 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771235 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771333 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771390 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771423 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771446 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771482 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.771537 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config\") pod \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\" (UID: \"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395\") " Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.784926 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4" (OuterVolumeSpecName: "kube-api-access-bl4j4") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "kube-api-access-bl4j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.791993 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.793315 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "ceilometer-publisher". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.797459 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.799017 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.806240 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.808701 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" (UID: "c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395"). InnerVolumeSpecName "healthcheck-log". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873612 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873657 4713 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873676 4713 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873688 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873702 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-kube-api-access-bl4j4\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873714 4713 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-config\") on node \"crc\" DevicePath \"\"" Mar 08 00:36:56 crc kubenswrapper[4713]: I0308 00:36:56.873729 4713 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395-collectd-entrypoint-script\") on node 
\"crc\" DevicePath \"\"" Mar 08 00:36:57 crc kubenswrapper[4713]: I0308 00:36:57.400449 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mljxj" event={"ID":"c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395","Type":"ContainerDied","Data":"d03c5bbc7c3a11f8d2df67f7991c4c8325f0d5644a757298cca3f8f3922e00ec"} Mar 08 00:36:57 crc kubenswrapper[4713]: I0308 00:36:57.400494 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d03c5bbc7c3a11f8d2df67f7991c4c8325f0d5644a757298cca3f8f3922e00ec" Mar 08 00:36:57 crc kubenswrapper[4713]: I0308 00:36:57.400536 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mljxj" Mar 08 00:36:57 crc kubenswrapper[4713]: I0308 00:36:57.541009 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:36:57 crc kubenswrapper[4713]: E0308 00:36:57.541306 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:36:58 crc kubenswrapper[4713]: I0308 00:36:58.519403 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-f4r52_7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033/smoketest-collectd/0.log" Mar 08 00:36:58 crc kubenswrapper[4713]: I0308 00:36:58.791062 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-f4r52_7cbd55a3-d3b0-4c65-8b16-a7a9e2a8c033/smoketest-ceilometer/0.log" Mar 08 00:36:59 crc kubenswrapper[4713]: I0308 00:36:59.023662 4713 log.go:25] "Finished parsing 
log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-qpwg6_a45b0eb2-8f38-42e0-8c0a-98a6f453263a/default-interconnect/0.log" Mar 08 00:36:59 crc kubenswrapper[4713]: I0308 00:36:59.333491 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq_7aaf11cd-f1cf-42c7-9fe9-52880e0af19c/bridge/2.log" Mar 08 00:36:59 crc kubenswrapper[4713]: I0308 00:36:59.568537 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-5l6gq_7aaf11cd-f1cf-42c7-9fe9-52880e0af19c/sg-core/0.log" Mar 08 00:36:59 crc kubenswrapper[4713]: I0308 00:36:59.872823 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4_dc460969-e1ae-4bac-8893-7677ac74787b/bridge/2.log" Mar 08 00:37:00 crc kubenswrapper[4713]: I0308 00:37:00.113454 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-6c46f97cd8-l6kx4_dc460969-e1ae-4bac-8893-7677ac74787b/sg-core/0.log" Mar 08 00:37:00 crc kubenswrapper[4713]: I0308 00:37:00.371610 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg_367439a6-a382-49f1-b0af-cf399b5a6401/bridge/2.log" Mar 08 00:37:00 crc kubenswrapper[4713]: I0308 00:37:00.602674 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-jrtvg_367439a6-a382-49f1-b0af-cf399b5a6401/sg-core/0.log" Mar 08 00:37:00 crc kubenswrapper[4713]: I0308 00:37:00.829977 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5_a441502e-5d0a-4ec6-ac3c-df20f292efc8/bridge/2.log" Mar 08 00:37:01 crc kubenswrapper[4713]: I0308 00:37:01.093508 4713 log.go:25] 
"Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6b7f55fd97-nxld5_a441502e-5d0a-4ec6-ac3c-df20f292efc8/sg-core/0.log" Mar 08 00:37:01 crc kubenswrapper[4713]: I0308 00:37:01.337282 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl_fff80c8a-de9a-483b-8be3-5ce1423649cb/bridge/2.log" Mar 08 00:37:01 crc kubenswrapper[4713]: I0308 00:37:01.601433 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-g87tl_fff80c8a-de9a-483b-8be3-5ce1423649cb/sg-core/0.log" Mar 08 00:37:04 crc kubenswrapper[4713]: I0308 00:37:04.626116 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-795859486c-d7k9q_934a7934-e52f-4279-9c2a-4255daf78d5a/operator/0.log" Mar 08 00:37:04 crc kubenswrapper[4713]: I0308 00:37:04.846310 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_cf91b8a6-24ec-4c39-8337-f05acf19e199/prometheus/0.log" Mar 08 00:37:05 crc kubenswrapper[4713]: I0308 00:37:05.085476 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_c8a16625-a3a9-4404-bf4a-073fc8f621b9/elasticsearch/0.log" Mar 08 00:37:05 crc kubenswrapper[4713]: I0308 00:37:05.317299 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lfj62_6bdaeb5b-32b1-4454-9a68-0893de41cc75/prometheus-webhook-snmp/0.log" Mar 08 00:37:05 crc kubenswrapper[4713]: I0308 00:37:05.572745 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_76d6e5d8-8303-43ac-a477-0dfe579adad2/alertmanager/0.log" Mar 08 00:37:10 crc kubenswrapper[4713]: I0308 00:37:10.541770 4713 scope.go:117] "RemoveContainer" 
containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:37:10 crc kubenswrapper[4713]: E0308 00:37:10.542502 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:37:19 crc kubenswrapper[4713]: I0308 00:37:19.858873 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-6f9dc9fb4b-dzbm4_c714eef0-0fe5-4836-80e1-c640aa9527e7/operator/0.log" Mar 08 00:37:22 crc kubenswrapper[4713]: I0308 00:37:22.941695 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-795859486c-d7k9q_934a7934-e52f-4279-9c2a-4255daf78d5a/operator/0.log" Mar 08 00:37:23 crc kubenswrapper[4713]: I0308 00:37:23.204018 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_1a97222f-e496-4378-bc3d-6a508f559df7/qdr/0.log" Mar 08 00:37:24 crc kubenswrapper[4713]: I0308 00:37:24.542644 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:37:24 crc kubenswrapper[4713]: E0308 00:37:24.542956 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:37:33 crc kubenswrapper[4713]: I0308 00:37:33.182540 4713 
scope.go:117] "RemoveContainer" containerID="fd8002808c5d3f13b3b01cadcdced7f1edb530c711896d763103800ccc5d24e3" Mar 08 00:37:33 crc kubenswrapper[4713]: I0308 00:37:33.209975 4713 scope.go:117] "RemoveContainer" containerID="8a4854cb64f8a7f1201a66c3c0908bf11c24711a500d6939eec4e2631a9a94e6" Mar 08 00:37:33 crc kubenswrapper[4713]: I0308 00:37:33.256137 4713 scope.go:117] "RemoveContainer" containerID="c082051221894646965936ec6155e8aca998188d9e68b92365d5716b581ebfa0" Mar 08 00:37:33 crc kubenswrapper[4713]: I0308 00:37:33.287494 4713 scope.go:117] "RemoveContainer" containerID="a21bd8ee1ac8242c094817b3835b31572654184e63adec117111a47c5246ee20" Mar 08 00:37:38 crc kubenswrapper[4713]: I0308 00:37:38.542156 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:37:38 crc kubenswrapper[4713]: E0308 00:37:38.543096 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.415697 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:37:46 crc kubenswrapper[4713]: E0308 00:37:46.416512 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-collectd" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.416526 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-collectd" Mar 08 00:37:46 crc kubenswrapper[4713]: E0308 00:37:46.416565 4713 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-ceilometer" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.416574 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-ceilometer" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.416691 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-ceilometer" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.416707 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fcf46e-b1e4-47b1-ad9c-dd4dbd45c395" containerName="smoketest-collectd" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.417289 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.431793 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.516186 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6vtr\" (UniqueName: \"kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr\") pod \"infrawatch-operators-wbrks\" (UID: \"2c1b190e-aa10-4da9-a6a5-2f15cb53e693\") " pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.618012 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6vtr\" (UniqueName: \"kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr\") pod \"infrawatch-operators-wbrks\" (UID: \"2c1b190e-aa10-4da9-a6a5-2f15cb53e693\") " pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.640565 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6vtr\" (UniqueName: \"kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr\") pod \"infrawatch-operators-wbrks\" (UID: \"2c1b190e-aa10-4da9-a6a5-2f15cb53e693\") " pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.733146 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:46 crc kubenswrapper[4713]: I0308 00:37:46.956588 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:37:47 crc kubenswrapper[4713]: I0308 00:37:47.784097 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-wbrks" event={"ID":"2c1b190e-aa10-4da9-a6a5-2f15cb53e693","Type":"ContainerStarted","Data":"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328"} Mar 08 00:37:47 crc kubenswrapper[4713]: I0308 00:37:47.784403 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-wbrks" event={"ID":"2c1b190e-aa10-4da9-a6a5-2f15cb53e693","Type":"ContainerStarted","Data":"dea16eae43b99de0d35d2efa59657d5e58e913b4d65320c433c0e7ecae5c5694"} Mar 08 00:37:47 crc kubenswrapper[4713]: I0308 00:37:47.803406 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-wbrks" podStartSLOduration=1.696018105 podStartE2EDuration="1.803391576s" podCreationTimestamp="2026-03-08 00:37:46 +0000 UTC" firstStartedPulling="2026-03-08 00:37:46.958076502 +0000 UTC m=+1921.077708735" lastFinishedPulling="2026-03-08 00:37:47.065449963 +0000 UTC m=+1921.185082206" observedRunningTime="2026-03-08 00:37:47.802995836 +0000 UTC m=+1921.922628069" watchObservedRunningTime="2026-03-08 00:37:47.803391576 +0000 UTC m=+1921.923023809" Mar 08 00:37:51 crc 
kubenswrapper[4713]: I0308 00:37:51.541393 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:37:51 crc kubenswrapper[4713]: E0308 00:37:51.541867 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.733676 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.734266 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.765959 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.831182 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cz8lx/must-gather-6ljft"] Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.832222 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.839758 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cz8lx"/"openshift-service-ca.crt" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.840941 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cz8lx"/"kube-root-ca.crt" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.856906 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cz8lx/must-gather-6ljft"] Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.910223 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.975909 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:56 crc kubenswrapper[4713]: I0308 00:37:56.976129 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjx78\" (UniqueName: \"kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.039939 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.077339 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.077477 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjx78\" (UniqueName: \"kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.077866 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.095044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjx78\" (UniqueName: \"kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78\") pod \"must-gather-6ljft\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") " pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.151138 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cz8lx/must-gather-6ljft" Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.410698 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cz8lx/must-gather-6ljft"] Mar 08 00:37:57 crc kubenswrapper[4713]: I0308 00:37:57.856104 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cz8lx/must-gather-6ljft" event={"ID":"0ea30b1a-51f4-4455-b6eb-d382b491da53","Type":"ContainerStarted","Data":"979e6681fbe6dac3ab631be1c477b1d0c3781d5561bf4ce433f7f61ccc039d85"} Mar 08 00:37:58 crc kubenswrapper[4713]: I0308 00:37:58.865797 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-wbrks" podUID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" containerName="registry-server" containerID="cri-o://f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328" gracePeriod=2 Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.277526 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.339092 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6vtr\" (UniqueName: \"kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr\") pod \"2c1b190e-aa10-4da9-a6a5-2f15cb53e693\" (UID: \"2c1b190e-aa10-4da9-a6a5-2f15cb53e693\") " Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.345273 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr" (OuterVolumeSpecName: "kube-api-access-v6vtr") pod "2c1b190e-aa10-4da9-a6a5-2f15cb53e693" (UID: "2c1b190e-aa10-4da9-a6a5-2f15cb53e693"). InnerVolumeSpecName "kube-api-access-v6vtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.440614 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6vtr\" (UniqueName: \"kubernetes.io/projected/2c1b190e-aa10-4da9-a6a5-2f15cb53e693-kube-api-access-v6vtr\") on node \"crc\" DevicePath \"\"" Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.874192 4713 generic.go:334] "Generic (PLEG): container finished" podID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" containerID="f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328" exitCode=0 Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.874233 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-wbrks" event={"ID":"2c1b190e-aa10-4da9-a6a5-2f15cb53e693","Type":"ContainerDied","Data":"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328"} Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.874260 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-wbrks" event={"ID":"2c1b190e-aa10-4da9-a6a5-2f15cb53e693","Type":"ContainerDied","Data":"dea16eae43b99de0d35d2efa59657d5e58e913b4d65320c433c0e7ecae5c5694"} Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.874277 4713 scope.go:117] "RemoveContainer" containerID="f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328" Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.874370 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-wbrks" Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.910113 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:37:59 crc kubenswrapper[4713]: I0308 00:37:59.918461 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-wbrks"] Mar 08 00:38:00 crc kubenswrapper[4713]: E0308 00:38:00.022634 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1b190e_aa10_4da9_a6a5_2f15cb53e693.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1b190e_aa10_4da9_a6a5_2f15cb53e693.slice/crio-dea16eae43b99de0d35d2efa59657d5e58e913b4d65320c433c0e7ecae5c5694\": RecentStats: unable to find data in memory cache]" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.146109 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548838-2zhvk"] Mar 08 00:38:00 crc kubenswrapper[4713]: E0308 00:38:00.146459 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" containerName="registry-server" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.146482 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" containerName="registry-server" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.146630 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" containerName="registry-server" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.147194 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.149656 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.149677 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.149755 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.160074 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548838-2zhvk"] Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.251597 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d57vh\" (UniqueName: \"kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh\") pod \"auto-csr-approver-29548838-2zhvk\" (UID: \"ca9577aa-e929-4bd9-8056-a85221917ebc\") " pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.353111 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d57vh\" (UniqueName: \"kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh\") pod \"auto-csr-approver-29548838-2zhvk\" (UID: \"ca9577aa-e929-4bd9-8056-a85221917ebc\") " pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.371320 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d57vh\" (UniqueName: \"kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh\") pod \"auto-csr-approver-29548838-2zhvk\" (UID: \"ca9577aa-e929-4bd9-8056-a85221917ebc\") " 
pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.466869 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:00 crc kubenswrapper[4713]: I0308 00:38:00.549446 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c1b190e-aa10-4da9-a6a5-2f15cb53e693" path="/var/lib/kubelet/pods/2c1b190e-aa10-4da9-a6a5-2f15cb53e693/volumes" Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.399726 4713 scope.go:117] "RemoveContainer" containerID="f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328" Mar 08 00:38:04 crc kubenswrapper[4713]: E0308 00:38:04.400387 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328\": container with ID starting with f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328 not found: ID does not exist" containerID="f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328" Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.400419 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328"} err="failed to get container status \"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328\": rpc error: code = NotFound desc = could not find container \"f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328\": container with ID starting with f9ffc2d60bb0d0e5df705841daa95640bd01899e3a4977a890d516b178613328 not found: ID does not exist" Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.866303 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548838-2zhvk"] Mar 08 00:38:04 crc kubenswrapper[4713]: W0308 00:38:04.869358 4713 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca9577aa_e929_4bd9_8056_a85221917ebc.slice/crio-999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999 WatchSource:0}: Error finding container 999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999: Status 404 returned error can't find the container with id 999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999 Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.872328 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.920402 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" event={"ID":"ca9577aa-e929-4bd9-8056-a85221917ebc","Type":"ContainerStarted","Data":"999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999"} Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.921835 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cz8lx/must-gather-6ljft" event={"ID":"0ea30b1a-51f4-4455-b6eb-d382b491da53","Type":"ContainerStarted","Data":"2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7"} Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.921868 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cz8lx/must-gather-6ljft" event={"ID":"0ea30b1a-51f4-4455-b6eb-d382b491da53","Type":"ContainerStarted","Data":"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7"} Mar 08 00:38:04 crc kubenswrapper[4713]: I0308 00:38:04.937239 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cz8lx/must-gather-6ljft" podStartSLOduration=1.881564291 podStartE2EDuration="8.93722374s" podCreationTimestamp="2026-03-08 00:37:56 +0000 UTC" firstStartedPulling="2026-03-08 00:37:57.42628097 +0000 UTC m=+1931.545913203" 
lastFinishedPulling="2026-03-08 00:38:04.481940419 +0000 UTC m=+1938.601572652" observedRunningTime="2026-03-08 00:38:04.934677042 +0000 UTC m=+1939.054309285" watchObservedRunningTime="2026-03-08 00:38:04.93722374 +0000 UTC m=+1939.056855973" Mar 08 00:38:06 crc kubenswrapper[4713]: I0308 00:38:06.546729 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:38:06 crc kubenswrapper[4713]: E0308 00:38:06.547413 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:38:06 crc kubenswrapper[4713]: I0308 00:38:06.942589 4713 generic.go:334] "Generic (PLEG): container finished" podID="ca9577aa-e929-4bd9-8056-a85221917ebc" containerID="73f07223497890b1ddebc93a4d6e16e91e4882539ad2021b501bdc3d3d15f480" exitCode=0 Mar 08 00:38:06 crc kubenswrapper[4713]: I0308 00:38:06.942742 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" event={"ID":"ca9577aa-e929-4bd9-8056-a85221917ebc","Type":"ContainerDied","Data":"73f07223497890b1ddebc93a4d6e16e91e4882539ad2021b501bdc3d3d15f480"} Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.219881 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.379151 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d57vh\" (UniqueName: \"kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh\") pod \"ca9577aa-e929-4bd9-8056-a85221917ebc\" (UID: \"ca9577aa-e929-4bd9-8056-a85221917ebc\") " Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.385745 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh" (OuterVolumeSpecName: "kube-api-access-d57vh") pod "ca9577aa-e929-4bd9-8056-a85221917ebc" (UID: "ca9577aa-e929-4bd9-8056-a85221917ebc"). InnerVolumeSpecName "kube-api-access-d57vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.481146 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d57vh\" (UniqueName: \"kubernetes.io/projected/ca9577aa-e929-4bd9-8056-a85221917ebc-kube-api-access-d57vh\") on node \"crc\" DevicePath \"\"" Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.963952 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" event={"ID":"ca9577aa-e929-4bd9-8056-a85221917ebc","Type":"ContainerDied","Data":"999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999"} Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.964259 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="999f97d2516761c56787c8df22d07ba071ba1de8bc1d85be82b04c8ec0507999" Mar 08 00:38:08 crc kubenswrapper[4713]: I0308 00:38:08.964308 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29548838-2zhvk" Mar 08 00:38:09 crc kubenswrapper[4713]: I0308 00:38:09.273434 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548832-6k4lz"] Mar 08 00:38:09 crc kubenswrapper[4713]: I0308 00:38:09.278465 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548832-6k4lz"] Mar 08 00:38:10 crc kubenswrapper[4713]: I0308 00:38:10.549242 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a13b2b-064d-4323-8d5c-d86f76405f38" path="/var/lib/kubelet/pods/d0a13b2b-064d-4323-8d5c-d86f76405f38/volumes" Mar 08 00:38:17 crc kubenswrapper[4713]: I0308 00:38:17.541211 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:38:17 crc kubenswrapper[4713]: E0308 00:38:17.542001 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:38:30 crc kubenswrapper[4713]: I0308 00:38:30.541122 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:38:30 crc kubenswrapper[4713]: E0308 00:38:30.542106 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4kr8v_openshift-machine-config-operator(5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76)\"" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" 
podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" Mar 08 00:38:33 crc kubenswrapper[4713]: I0308 00:38:33.358972 4713 scope.go:117] "RemoveContainer" containerID="d06ee3cd17ca3058dd1d41ca8e61fbdf1a5ff7196264bb612799359dc20d5255" Mar 08 00:38:42 crc kubenswrapper[4713]: I0308 00:38:42.197406 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7wd77_f878574f-5b4a-4a3f-9b2b-e8eeb569f0fc/control-plane-machine-set-operator/0.log" Mar 08 00:38:42 crc kubenswrapper[4713]: I0308 00:38:42.360043 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dkkh7_c6893b56-2395-4f91-9349-c23b48b957c8/machine-api-operator/0.log" Mar 08 00:38:42 crc kubenswrapper[4713]: I0308 00:38:42.410539 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dkkh7_c6893b56-2395-4f91-9349-c23b48b957c8/kube-rbac-proxy/0.log" Mar 08 00:38:44 crc kubenswrapper[4713]: I0308 00:38:44.541880 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 00:38:45 crc kubenswrapper[4713]: I0308 00:38:45.220747 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"80ca810d4dadcdf454d6a3193c471ad78a80c943fa65c9d882400f00b80252cd"} Mar 08 00:38:53 crc kubenswrapper[4713]: I0308 00:38:53.142317 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-gkqzr_d4f51ae9-d2ab-4704-aeeb-5710aceda4f0/cert-manager-controller/0.log" Mar 08 00:38:53 crc kubenswrapper[4713]: I0308 00:38:53.269113 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-9mcfp_1a191145-c818-4e84-8bf3-91145fe9db03/cert-manager-cainjector/0.log" Mar 08 00:38:53 crc kubenswrapper[4713]: I0308 00:38:53.338448 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-qmcpl_2a071bf2-22e7-40f7-976a-74f79abbbd78/cert-manager-webhook/0.log" Mar 08 00:39:06 crc kubenswrapper[4713]: I0308 00:39:06.963413 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-4z5hw_1f48c701-2464-42f6-b2d7-c851ae965f1b/prometheus-operator/0.log" Mar 08 00:39:07 crc kubenswrapper[4713]: I0308 00:39:07.107151 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5_860dc604-80d3-4d4b-8b1e-8a430b706882/prometheus-operator-admission-webhook/0.log" Mar 08 00:39:07 crc kubenswrapper[4713]: I0308 00:39:07.120414 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk_e2152c14-6da7-4f74-a30e-da9e4e7c1acc/prometheus-operator-admission-webhook/0.log" Mar 08 00:39:07 crc kubenswrapper[4713]: I0308 00:39:07.295520 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-v4h4x_f559f6d0-89dc-4d38-807f-491671408dc7/operator/0.log" Mar 08 00:39:07 crc kubenswrapper[4713]: I0308 00:39:07.359542 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tw72p_3d1a0596-7485-4376-9630-688753a7abd7/perses-operator/0.log" Mar 08 00:39:20 crc kubenswrapper[4713]: I0308 00:39:20.939163 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 
00:39:21.105630 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.107082 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.147202 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.304582 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.313042 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.333192 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fjwjpt_54dbca74-9530-4327-8ede-124dc50096cf/extract/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.514332 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.610445 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.649634 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.668405 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.800608 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/extract/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.802219 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/pull/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.828350 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egwx2p_82947b22-2505-49f0-94e0-039a1a219656/util/0.log" Mar 08 00:39:21 crc kubenswrapper[4713]: I0308 00:39:21.960944 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/util/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.086129 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/util/0.log" Mar 08 
00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.132018 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.133130 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.307519 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/extract/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.333273 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.337871 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t8vlw_f5cca55d-5b29-4aa4-a88c-c15c3c9d0cc2/util/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.493332 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/util/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.658963 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/util/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.664372 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.664465 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.828802 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/util/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.853249 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/pull/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.860517 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085v25p_9a95188d-5e62-49d4-851d-08195ed98f4d/extract/0.log" Mar 08 00:39:22 crc kubenswrapper[4713]: I0308 00:39:22.979017 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-utilities/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.138891 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-utilities/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.181646 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.181688 4713 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.301626 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-utilities/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.317372 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.554392 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-utilities/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.654621 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mn4rt_ce49dca5-e07d-416e-a72d-281928ff343b/registry-server/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.708218 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-utilities/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.789253 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.789264 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.943432 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-content/0.log" Mar 08 00:39:23 crc kubenswrapper[4713]: I0308 00:39:23.963070 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/extract-utilities/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.147532 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4bm59_26e0cfc6-458c-4be3-b57c-1cd5fad657c4/marketplace-operator/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.261413 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-utilities/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.405917 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rc7p9_dd52d225-2e7e-4958-98fc-52028b545353/registry-server/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.455185 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-utilities/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.485778 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-content/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.491017 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-content/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.612857 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-content/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.686439 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/extract-utilities/0.log" Mar 08 00:39:24 crc kubenswrapper[4713]: I0308 00:39:24.935993 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4b75j_47027c84-0848-4140-bed0-b04f627cf6da/registry-server/0.log" Mar 08 00:39:37 crc kubenswrapper[4713]: I0308 00:39:37.299187 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77db5b85fd-6qhb5_860dc604-80d3-4d4b-8b1e-8a430b706882/prometheus-operator-admission-webhook/0.log" Mar 08 00:39:37 crc kubenswrapper[4713]: I0308 00:39:37.314073 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-4z5hw_1f48c701-2464-42f6-b2d7-c851ae965f1b/prometheus-operator/0.log" Mar 08 00:39:37 crc kubenswrapper[4713]: I0308 00:39:37.408489 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77db5b85fd-xr8kk_e2152c14-6da7-4f74-a30e-da9e4e7c1acc/prometheus-operator-admission-webhook/0.log" Mar 08 00:39:37 crc kubenswrapper[4713]: I0308 00:39:37.474480 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-v4h4x_f559f6d0-89dc-4d38-807f-491671408dc7/operator/0.log" Mar 08 00:39:37 crc kubenswrapper[4713]: I0308 00:39:37.505524 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tw72p_3d1a0596-7485-4376-9630-688753a7abd7/perses-operator/0.log" Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.147722 4713 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29548840-6mm4q"]
Mar 08 00:40:00 crc kubenswrapper[4713]: E0308 00:40:00.148500 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9577aa-e929-4bd9-8056-a85221917ebc" containerName="oc"
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.148513 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9577aa-e929-4bd9-8056-a85221917ebc" containerName="oc"
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.148650 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9577aa-e929-4bd9-8056-a85221917ebc" containerName="oc"
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.149101 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548840-6mm4q"
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.151812 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.151907 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.152272 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t"
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.162763 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548840-6mm4q"]
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.210611 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfth\" (UniqueName: \"kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth\") pod \"auto-csr-approver-29548840-6mm4q\" (UID: \"c19c555a-8190-4c25-97c4-3d6b74b4fd7f\") " pod="openshift-infra/auto-csr-approver-29548840-6mm4q"
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.312562 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfth\" (UniqueName: \"kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth\") pod \"auto-csr-approver-29548840-6mm4q\" (UID: \"c19c555a-8190-4c25-97c4-3d6b74b4fd7f\") " pod="openshift-infra/auto-csr-approver-29548840-6mm4q"
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.330241 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpfth\" (UniqueName: \"kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth\") pod \"auto-csr-approver-29548840-6mm4q\" (UID: \"c19c555a-8190-4c25-97c4-3d6b74b4fd7f\") " pod="openshift-infra/auto-csr-approver-29548840-6mm4q"
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.470855 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548840-6mm4q"
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.688171 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548840-6mm4q"]
Mar 08 00:40:00 crc kubenswrapper[4713]: I0308 00:40:00.762936 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548840-6mm4q" event={"ID":"c19c555a-8190-4c25-97c4-3d6b74b4fd7f","Type":"ContainerStarted","Data":"d6759f46bf459f442b5321a8633c93e49983cb7b9bb403c768ed87a665fbbea5"}
Mar 08 00:40:02 crc kubenswrapper[4713]: I0308 00:40:02.775992 4713 generic.go:334] "Generic (PLEG): container finished" podID="c19c555a-8190-4c25-97c4-3d6b74b4fd7f" containerID="1b4785721e3b1cd2a3224d4a2879be9724cfc7ed1cc394cbcb9be86d2951adad" exitCode=0
Mar 08 00:40:02 crc kubenswrapper[4713]: I0308 00:40:02.776053 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548840-6mm4q" event={"ID":"c19c555a-8190-4c25-97c4-3d6b74b4fd7f","Type":"ContainerDied","Data":"1b4785721e3b1cd2a3224d4a2879be9724cfc7ed1cc394cbcb9be86d2951adad"}
Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.050948 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548840-6mm4q"
Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.170922 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpfth\" (UniqueName: \"kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth\") pod \"c19c555a-8190-4c25-97c4-3d6b74b4fd7f\" (UID: \"c19c555a-8190-4c25-97c4-3d6b74b4fd7f\") "
Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.175760 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth" (OuterVolumeSpecName: "kube-api-access-bpfth") pod "c19c555a-8190-4c25-97c4-3d6b74b4fd7f" (UID: "c19c555a-8190-4c25-97c4-3d6b74b4fd7f"). InnerVolumeSpecName "kube-api-access-bpfth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.272706 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpfth\" (UniqueName: \"kubernetes.io/projected/c19c555a-8190-4c25-97c4-3d6b74b4fd7f-kube-api-access-bpfth\") on node \"crc\" DevicePath \"\""
Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.796775 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548840-6mm4q" event={"ID":"c19c555a-8190-4c25-97c4-3d6b74b4fd7f","Type":"ContainerDied","Data":"d6759f46bf459f442b5321a8633c93e49983cb7b9bb403c768ed87a665fbbea5"}
Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.796839 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6759f46bf459f442b5321a8633c93e49983cb7b9bb403c768ed87a665fbbea5"
Mar 08 00:40:04 crc kubenswrapper[4713]: I0308 00:40:04.796869 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548840-6mm4q"
Mar 08 00:40:05 crc kubenswrapper[4713]: I0308 00:40:05.117429 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548834-njxhh"]
Mar 08 00:40:05 crc kubenswrapper[4713]: I0308 00:40:05.123284 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548834-njxhh"]
Mar 08 00:40:06 crc kubenswrapper[4713]: I0308 00:40:06.551098 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef90820d-fdcc-4ff1-97db-756e8c96851a" path="/var/lib/kubelet/pods/ef90820d-fdcc-4ff1-97db-756e8c96851a/volumes"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.568734 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"]
Mar 08 00:40:16 crc kubenswrapper[4713]: E0308 00:40:16.569441 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19c555a-8190-4c25-97c4-3d6b74b4fd7f" containerName="oc"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.569455 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19c555a-8190-4c25-97c4-3d6b74b4fd7f" containerName="oc"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.569612 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c19c555a-8190-4c25-97c4-3d6b74b4fd7f" containerName="oc"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.570642 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"]
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.570735 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.729449 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.730586 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.730657 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld2m2\" (UniqueName: \"kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.831023 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.831136 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.831182 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld2m2\" (UniqueName: \"kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.832042 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.832325 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.848669 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld2m2\" (UniqueName: \"kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2\") pod \"redhat-operators-bcvn4\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") " pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:16 crc kubenswrapper[4713]: I0308 00:40:16.886422 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:17 crc kubenswrapper[4713]: I0308 00:40:17.385767 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"]
Mar 08 00:40:17 crc kubenswrapper[4713]: I0308 00:40:17.916977 4713 generic.go:334] "Generic (PLEG): container finished" podID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerID="12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26" exitCode=0
Mar 08 00:40:17 crc kubenswrapper[4713]: I0308 00:40:17.917030 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerDied","Data":"12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26"}
Mar 08 00:40:17 crc kubenswrapper[4713]: I0308 00:40:17.917060 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerStarted","Data":"7c2423739e13a478b468599d3a2d1d9b6ec7b44cb5c5a51aec50a5c426275fb3"}
Mar 08 00:40:19 crc kubenswrapper[4713]: I0308 00:40:19.935376 4713 generic.go:334] "Generic (PLEG): container finished" podID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerID="c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb" exitCode=0
Mar 08 00:40:19 crc kubenswrapper[4713]: I0308 00:40:19.935486 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerDied","Data":"c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb"}
Mar 08 00:40:20 crc kubenswrapper[4713]: I0308 00:40:20.947811 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerStarted","Data":"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52"}
Mar 08 00:40:20 crc kubenswrapper[4713]: I0308 00:40:20.984577 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bcvn4" podStartSLOduration=2.425717648 podStartE2EDuration="4.984539948s" podCreationTimestamp="2026-03-08 00:40:16 +0000 UTC" firstStartedPulling="2026-03-08 00:40:17.918851328 +0000 UTC m=+2072.038483561" lastFinishedPulling="2026-03-08 00:40:20.477673598 +0000 UTC m=+2074.597305861" observedRunningTime="2026-03-08 00:40:20.973501317 +0000 UTC m=+2075.093133580" watchObservedRunningTime="2026-03-08 00:40:20.984539948 +0000 UTC m=+2075.104172191"
Mar 08 00:40:26 crc kubenswrapper[4713]: I0308 00:40:26.887086 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:26 crc kubenswrapper[4713]: I0308 00:40:26.887590 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:27 crc kubenswrapper[4713]: I0308 00:40:27.943447 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bcvn4" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="registry-server" probeResult="failure" output=<
Mar 08 00:40:27 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 08 00:40:27 crc kubenswrapper[4713]: >
Mar 08 00:40:32 crc kubenswrapper[4713]: I0308 00:40:32.082476 4713 generic.go:334] "Generic (PLEG): container finished" podID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerID="5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7" exitCode=0
Mar 08 00:40:32 crc kubenswrapper[4713]: I0308 00:40:32.082594 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cz8lx/must-gather-6ljft" event={"ID":"0ea30b1a-51f4-4455-b6eb-d382b491da53","Type":"ContainerDied","Data":"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7"}
Mar 08 00:40:32 crc kubenswrapper[4713]: I0308 00:40:32.084519 4713 scope.go:117] "RemoveContainer" containerID="5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7"
Mar 08 00:40:32 crc kubenswrapper[4713]: I0308 00:40:32.557620 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cz8lx_must-gather-6ljft_0ea30b1a-51f4-4455-b6eb-d382b491da53/gather/0.log"
Mar 08 00:40:33 crc kubenswrapper[4713]: I0308 00:40:33.448654 4713 scope.go:117] "RemoveContainer" containerID="54d98c92ae122fbfe885e4ff1e76b36a0e389e6c7ef0c5d932a7c247396198f3"
Mar 08 00:40:36 crc kubenswrapper[4713]: I0308 00:40:36.948051 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:36 crc kubenswrapper[4713]: I0308 00:40:36.997103 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:37 crc kubenswrapper[4713]: I0308 00:40:37.177147 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"]
Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.131639 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bcvn4" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="registry-server" containerID="cri-o://7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52" gracePeriod=2
Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.480262 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.512055 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld2m2\" (UniqueName: \"kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2\") pod \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") "
Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.519860 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2" (OuterVolumeSpecName: "kube-api-access-ld2m2") pod "b6fd257c-a12b-4c64-b2d2-8f89db2abb10" (UID: "b6fd257c-a12b-4c64-b2d2-8f89db2abb10"). InnerVolumeSpecName "kube-api-access-ld2m2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.613006 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities\") pod \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") "
Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.613062 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content\") pod \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\" (UID: \"b6fd257c-a12b-4c64-b2d2-8f89db2abb10\") "
Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.613535 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld2m2\" (UniqueName: \"kubernetes.io/projected/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-kube-api-access-ld2m2\") on node \"crc\" DevicePath \"\""
Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.614411 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities" (OuterVolumeSpecName: "utilities") pod "b6fd257c-a12b-4c64-b2d2-8f89db2abb10" (UID: "b6fd257c-a12b-4c64-b2d2-8f89db2abb10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.716022 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.739542 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6fd257c-a12b-4c64-b2d2-8f89db2abb10" (UID: "b6fd257c-a12b-4c64-b2d2-8f89db2abb10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:40:38 crc kubenswrapper[4713]: I0308 00:40:38.817354 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6fd257c-a12b-4c64-b2d2-8f89db2abb10-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.141091 4713 generic.go:334] "Generic (PLEG): container finished" podID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerID="7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52" exitCode=0
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.141155 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerDied","Data":"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52"}
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.141703 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcvn4" event={"ID":"b6fd257c-a12b-4c64-b2d2-8f89db2abb10","Type":"ContainerDied","Data":"7c2423739e13a478b468599d3a2d1d9b6ec7b44cb5c5a51aec50a5c426275fb3"}
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.141209 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcvn4"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.141795 4713 scope.go:117] "RemoveContainer" containerID="7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.169847 4713 scope.go:117] "RemoveContainer" containerID="c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.173359 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"]
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.182336 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bcvn4"]
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.201621 4713 scope.go:117] "RemoveContainer" containerID="12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.237659 4713 scope.go:117] "RemoveContainer" containerID="7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52"
Mar 08 00:40:39 crc kubenswrapper[4713]: E0308 00:40:39.238373 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52\": container with ID starting with 7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52 not found: ID does not exist" containerID="7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.238423 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52"} err="failed to get container status \"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52\": rpc error: code = NotFound desc = could not find container \"7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52\": container with ID starting with 7064f3ab38a1f16525e2bf4e489d38dced71290796194e703d3a80fe5a226c52 not found: ID does not exist"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.238452 4713 scope.go:117] "RemoveContainer" containerID="c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb"
Mar 08 00:40:39 crc kubenswrapper[4713]: E0308 00:40:39.238983 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb\": container with ID starting with c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb not found: ID does not exist" containerID="c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.239096 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb"} err="failed to get container status \"c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb\": rpc error: code = NotFound desc = could not find container \"c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb\": container with ID starting with c4cccf21a02b5efcddf8b7e62df2239f5b861f141dd7722f8969b75b52f2a5cb not found: ID does not exist"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.239207 4713 scope.go:117] "RemoveContainer" containerID="12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26"
Mar 08 00:40:39 crc kubenswrapper[4713]: E0308 00:40:39.239609 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26\": container with ID starting with 12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26 not found: ID does not exist" containerID="12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.239635 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26"} err="failed to get container status \"12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26\": rpc error: code = NotFound desc = could not find container \"12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26\": container with ID starting with 12b2b31aa0bf208e9c198b4b2d9ae3b9a56439807ce203d6513de8aac0554d26 not found: ID does not exist"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.323683 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cz8lx/must-gather-6ljft"]
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.323940 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-cz8lx/must-gather-6ljft" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="copy" containerID="cri-o://2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7" gracePeriod=2
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.342283 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cz8lx/must-gather-6ljft"]
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.691907 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cz8lx_must-gather-6ljft_0ea30b1a-51f4-4455-b6eb-d382b491da53/copy/0.log"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.692945 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cz8lx/must-gather-6ljft"
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.729793 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjx78\" (UniqueName: \"kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78\") pod \"0ea30b1a-51f4-4455-b6eb-d382b491da53\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") "
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.730093 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output\") pod \"0ea30b1a-51f4-4455-b6eb-d382b491da53\" (UID: \"0ea30b1a-51f4-4455-b6eb-d382b491da53\") "
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.736946 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78" (OuterVolumeSpecName: "kube-api-access-qjx78") pod "0ea30b1a-51f4-4455-b6eb-d382b491da53" (UID: "0ea30b1a-51f4-4455-b6eb-d382b491da53"). InnerVolumeSpecName "kube-api-access-qjx78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.800393 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0ea30b1a-51f4-4455-b6eb-d382b491da53" (UID: "0ea30b1a-51f4-4455-b6eb-d382b491da53"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.832415 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjx78\" (UniqueName: \"kubernetes.io/projected/0ea30b1a-51f4-4455-b6eb-d382b491da53-kube-api-access-qjx78\") on node \"crc\" DevicePath \"\""
Mar 08 00:40:39 crc kubenswrapper[4713]: I0308 00:40:39.832468 4713 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ea30b1a-51f4-4455-b6eb-d382b491da53-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.152688 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cz8lx_must-gather-6ljft_0ea30b1a-51f4-4455-b6eb-d382b491da53/copy/0.log"
Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.153183 4713 generic.go:334] "Generic (PLEG): container finished" podID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerID="2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7" exitCode=143
Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.153266 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cz8lx/must-gather-6ljft"
Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.153265 4713 scope.go:117] "RemoveContainer" containerID="2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7"
Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.170487 4713 scope.go:117] "RemoveContainer" containerID="5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7"
Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.226621 4713 scope.go:117] "RemoveContainer" containerID="2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7"
Mar 08 00:40:40 crc kubenswrapper[4713]: E0308 00:40:40.228133 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7\": container with ID starting with 2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7 not found: ID does not exist" containerID="2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7"
Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.228219 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7"} err="failed to get container status \"2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7\": rpc error: code = NotFound desc = could not find container \"2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7\": container with ID starting with 2e2ae6565c1f19e938373e0180e909754fc32d1e75c3b491b94738a45e6b61d7 not found: ID does not exist"
Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.228266 4713 scope.go:117] "RemoveContainer" containerID="5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7"
Mar 08 00:40:40 crc kubenswrapper[4713]: E0308 00:40:40.228733 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7\": container with ID starting with 5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7 not found: ID does not exist" containerID="5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7"
Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.228770 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7"} err="failed to get container status \"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7\": rpc error: code = NotFound desc = could not find container \"5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7\": container with ID starting with 5d4ba5e09c1289057ca2875f3df44ed349eb2cc42c3d61ea35480f88ee82bfc7 not found: ID does not exist"
Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.549155 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" path="/var/lib/kubelet/pods/0ea30b1a-51f4-4455-b6eb-d382b491da53/volumes"
Mar 08 00:40:40 crc kubenswrapper[4713]: I0308 00:40:40.549790 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" path="/var/lib/kubelet/pods/b6fd257c-a12b-4c64-b2d2-8f89db2abb10/volumes"
Mar 08 00:41:04 crc kubenswrapper[4713]: I0308 00:41:04.501378 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 00:41:04 crc kubenswrapper[4713]: I0308 00:41:04.501765 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.466852 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x94gq"]
Mar 08 00:41:10 crc kubenswrapper[4713]: E0308 00:41:10.468198 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="extract-utilities"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468257 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="extract-utilities"
Mar 08 00:41:10 crc kubenswrapper[4713]: E0308 00:41:10.468296 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="extract-content"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468318 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="extract-content"
Mar 08 00:41:10 crc kubenswrapper[4713]: E0308 00:41:10.468348 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="registry-server"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468365 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="registry-server"
Mar 08 00:41:10 crc kubenswrapper[4713]: E0308 00:41:10.468400 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="copy"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468418 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="copy"
Mar 08 00:41:10 crc kubenswrapper[4713]: E0308 00:41:10.468446 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="gather"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468462 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="gather"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468781 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="copy"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468819 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6fd257c-a12b-4c64-b2d2-8f89db2abb10" containerName="registry-server"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.468931 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea30b1a-51f4-4455-b6eb-d382b491da53" containerName="gather"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.471190 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x94gq"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.481319 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x94gq"]
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.612483 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.612519 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-792pg\" (UniqueName: \"kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.612562 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.713683 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.713736 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-792pg\" (UniqueName: \"kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.713764 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq"
Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.714289 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content\") pod \"certified-operators-x94gq\" (UID:
\"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.714325 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.733729 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-792pg\" (UniqueName: \"kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg\") pod \"certified-operators-x94gq\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:10 crc kubenswrapper[4713]: I0308 00:41:10.816034 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:11 crc kubenswrapper[4713]: I0308 00:41:11.088611 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x94gq"] Mar 08 00:41:11 crc kubenswrapper[4713]: I0308 00:41:11.397836 4713 generic.go:334] "Generic (PLEG): container finished" podID="8e79129c-88cb-499d-9181-37edfb346e17" containerID="8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a" exitCode=0 Mar 08 00:41:11 crc kubenswrapper[4713]: I0308 00:41:11.397874 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerDied","Data":"8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a"} Mar 08 00:41:11 crc kubenswrapper[4713]: I0308 00:41:11.398201 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" 
event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerStarted","Data":"9d1813c737de7699c02f3dc563346797ca3fce4b1d9b6cb161c92b45f6f598f5"} Mar 08 00:41:12 crc kubenswrapper[4713]: I0308 00:41:12.411947 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerStarted","Data":"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09"} Mar 08 00:41:13 crc kubenswrapper[4713]: I0308 00:41:13.424140 4713 generic.go:334] "Generic (PLEG): container finished" podID="8e79129c-88cb-499d-9181-37edfb346e17" containerID="7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09" exitCode=0 Mar 08 00:41:13 crc kubenswrapper[4713]: I0308 00:41:13.424203 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerDied","Data":"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09"} Mar 08 00:41:14 crc kubenswrapper[4713]: I0308 00:41:14.434323 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerStarted","Data":"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef"} Mar 08 00:41:14 crc kubenswrapper[4713]: I0308 00:41:14.454056 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x94gq" podStartSLOduration=1.793535927 podStartE2EDuration="4.454035981s" podCreationTimestamp="2026-03-08 00:41:10 +0000 UTC" firstStartedPulling="2026-03-08 00:41:11.398956829 +0000 UTC m=+2125.518589062" lastFinishedPulling="2026-03-08 00:41:14.059456873 +0000 UTC m=+2128.179089116" observedRunningTime="2026-03-08 00:41:14.451611687 +0000 UTC m=+2128.571243950" watchObservedRunningTime="2026-03-08 00:41:14.454035981 +0000 UTC 
m=+2128.573668224" Mar 08 00:41:20 crc kubenswrapper[4713]: I0308 00:41:20.816357 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:20 crc kubenswrapper[4713]: I0308 00:41:20.817347 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:20 crc kubenswrapper[4713]: I0308 00:41:20.885495 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:21 crc kubenswrapper[4713]: I0308 00:41:21.529147 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:21 crc kubenswrapper[4713]: I0308 00:41:21.570791 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x94gq"] Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.506327 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x94gq" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="registry-server" containerID="cri-o://bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef" gracePeriod=2 Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.853369 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.924426 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content\") pod \"8e79129c-88cb-499d-9181-37edfb346e17\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.924505 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities\") pod \"8e79129c-88cb-499d-9181-37edfb346e17\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.924617 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-792pg\" (UniqueName: \"kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg\") pod \"8e79129c-88cb-499d-9181-37edfb346e17\" (UID: \"8e79129c-88cb-499d-9181-37edfb346e17\") " Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.926937 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities" (OuterVolumeSpecName: "utilities") pod "8e79129c-88cb-499d-9181-37edfb346e17" (UID: "8e79129c-88cb-499d-9181-37edfb346e17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.931200 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg" (OuterVolumeSpecName: "kube-api-access-792pg") pod "8e79129c-88cb-499d-9181-37edfb346e17" (UID: "8e79129c-88cb-499d-9181-37edfb346e17"). InnerVolumeSpecName "kube-api-access-792pg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:41:23 crc kubenswrapper[4713]: I0308 00:41:23.994316 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e79129c-88cb-499d-9181-37edfb346e17" (UID: "8e79129c-88cb-499d-9181-37edfb346e17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.026024 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-792pg\" (UniqueName: \"kubernetes.io/projected/8e79129c-88cb-499d-9181-37edfb346e17-kube-api-access-792pg\") on node \"crc\" DevicePath \"\"" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.026073 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.026092 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e79129c-88cb-499d-9181-37edfb346e17-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.528394 4713 generic.go:334] "Generic (PLEG): container finished" podID="8e79129c-88cb-499d-9181-37edfb346e17" containerID="bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef" exitCode=0 Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.528490 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x94gq" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.528516 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerDied","Data":"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef"} Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.529926 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94gq" event={"ID":"8e79129c-88cb-499d-9181-37edfb346e17","Type":"ContainerDied","Data":"9d1813c737de7699c02f3dc563346797ca3fce4b1d9b6cb161c92b45f6f598f5"} Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.529970 4713 scope.go:117] "RemoveContainer" containerID="bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.563496 4713 scope.go:117] "RemoveContainer" containerID="7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.582700 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x94gq"] Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.589524 4713 scope.go:117] "RemoveContainer" containerID="8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.589663 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x94gq"] Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.612862 4713 scope.go:117] "RemoveContainer" containerID="bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef" Mar 08 00:41:24 crc kubenswrapper[4713]: E0308 00:41:24.613305 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef\": container with ID starting with bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef not found: ID does not exist" containerID="bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.613411 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef"} err="failed to get container status \"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef\": rpc error: code = NotFound desc = could not find container \"bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef\": container with ID starting with bc30f956115d63b514d5e0f8d546ae9543b9089b03858d7fb6c7d2309983f0ef not found: ID does not exist" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.613513 4713 scope.go:117] "RemoveContainer" containerID="7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09" Mar 08 00:41:24 crc kubenswrapper[4713]: E0308 00:41:24.614047 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09\": container with ID starting with 7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09 not found: ID does not exist" containerID="7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.614088 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09"} err="failed to get container status \"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09\": rpc error: code = NotFound desc = could not find container \"7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09\": container with ID 
starting with 7e411cc044053b9669360dfca8db3729354b1136fd362fe8aae4ac11646c1a09 not found: ID does not exist" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.614117 4713 scope.go:117] "RemoveContainer" containerID="8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a" Mar 08 00:41:24 crc kubenswrapper[4713]: E0308 00:41:24.614460 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a\": container with ID starting with 8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a not found: ID does not exist" containerID="8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a" Mar 08 00:41:24 crc kubenswrapper[4713]: I0308 00:41:24.614500 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a"} err="failed to get container status \"8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a\": rpc error: code = NotFound desc = could not find container \"8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a\": container with ID starting with 8957bf0863a64a7840f9e7ef3e4e990f4e86bfa6eaa480276ba2b25ea26a5a7a not found: ID does not exist" Mar 08 00:41:26 crc kubenswrapper[4713]: I0308 00:41:26.559010 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e79129c-88cb-499d-9181-37edfb346e17" path="/var/lib/kubelet/pods/8e79129c-88cb-499d-9181-37edfb346e17/volumes" Mar 08 00:41:34 crc kubenswrapper[4713]: I0308 00:41:34.501152 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:41:34 crc kubenswrapper[4713]: I0308 
00:41:34.501731 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.148355 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29548842-nbbhm"] Mar 08 00:42:00 crc kubenswrapper[4713]: E0308 00:42:00.149528 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="registry-server" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.149552 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="registry-server" Mar 08 00:42:00 crc kubenswrapper[4713]: E0308 00:42:00.149599 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="extract-content" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.149616 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="extract-content" Mar 08 00:42:00 crc kubenswrapper[4713]: E0308 00:42:00.149639 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="extract-utilities" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.149656 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="extract-utilities" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.149990 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e79129c-88cb-499d-9181-37edfb346e17" containerName="registry-server" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.150731 4713 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.153493 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.153639 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jf28t" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.154899 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.155234 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548842-nbbhm"] Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.169997 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j5jl\" (UniqueName: \"kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl\") pod \"auto-csr-approver-29548842-nbbhm\" (UID: \"9759dcd8-b056-4924-9c1f-96ae6cdd2341\") " pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.271022 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j5jl\" (UniqueName: \"kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl\") pod \"auto-csr-approver-29548842-nbbhm\" (UID: \"9759dcd8-b056-4924-9c1f-96ae6cdd2341\") " pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.292905 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j5jl\" (UniqueName: \"kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl\") pod \"auto-csr-approver-29548842-nbbhm\" (UID: 
\"9759dcd8-b056-4924-9c1f-96ae6cdd2341\") " pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.479562 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.718131 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29548842-nbbhm"] Mar 08 00:42:00 crc kubenswrapper[4713]: W0308 00:42:00.722349 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9759dcd8_b056_4924_9c1f_96ae6cdd2341.slice/crio-0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a WatchSource:0}: Error finding container 0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a: Status 404 returned error can't find the container with id 0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a Mar 08 00:42:00 crc kubenswrapper[4713]: I0308 00:42:00.851512 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" event={"ID":"9759dcd8-b056-4924-9c1f-96ae6cdd2341","Type":"ContainerStarted","Data":"0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a"} Mar 08 00:42:01 crc kubenswrapper[4713]: I0308 00:42:01.859092 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" event={"ID":"9759dcd8-b056-4924-9c1f-96ae6cdd2341","Type":"ContainerStarted","Data":"024756081a650cbe1c2fb8388c3bda8daa8b3d8054f170bbd33a09ee491a07fb"} Mar 08 00:42:01 crc kubenswrapper[4713]: I0308 00:42:01.874421 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" podStartSLOduration=1.08364329 podStartE2EDuration="1.87440073s" podCreationTimestamp="2026-03-08 00:42:00 +0000 UTC" firstStartedPulling="2026-03-08 
00:42:00.724726062 +0000 UTC m=+2174.844358295" lastFinishedPulling="2026-03-08 00:42:01.515483482 +0000 UTC m=+2175.635115735" observedRunningTime="2026-03-08 00:42:01.87214159 +0000 UTC m=+2175.991773823" watchObservedRunningTime="2026-03-08 00:42:01.87440073 +0000 UTC m=+2175.994032963" Mar 08 00:42:02 crc kubenswrapper[4713]: I0308 00:42:02.870951 4713 generic.go:334] "Generic (PLEG): container finished" podID="9759dcd8-b056-4924-9c1f-96ae6cdd2341" containerID="024756081a650cbe1c2fb8388c3bda8daa8b3d8054f170bbd33a09ee491a07fb" exitCode=0 Mar 08 00:42:02 crc kubenswrapper[4713]: I0308 00:42:02.870997 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" event={"ID":"9759dcd8-b056-4924-9c1f-96ae6cdd2341","Type":"ContainerDied","Data":"024756081a650cbe1c2fb8388c3bda8daa8b3d8054f170bbd33a09ee491a07fb"} Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.095027 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.227499 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j5jl\" (UniqueName: \"kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl\") pod \"9759dcd8-b056-4924-9c1f-96ae6cdd2341\" (UID: \"9759dcd8-b056-4924-9c1f-96ae6cdd2341\") " Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.232469 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl" (OuterVolumeSpecName: "kube-api-access-4j5jl") pod "9759dcd8-b056-4924-9c1f-96ae6cdd2341" (UID: "9759dcd8-b056-4924-9c1f-96ae6cdd2341"). InnerVolumeSpecName "kube-api-access-4j5jl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.329018 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j5jl\" (UniqueName: \"kubernetes.io/projected/9759dcd8-b056-4924-9c1f-96ae6cdd2341-kube-api-access-4j5jl\") on node \"crc\" DevicePath \"\"" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.501811 4713 patch_prober.go:28] interesting pod/machine-config-daemon-4kr8v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.501966 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.502019 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.502592 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80ca810d4dadcdf454d6a3193c471ad78a80c943fa65c9d882400f00b80252cd"} pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.502649 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" podUID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" 
containerName="machine-config-daemon" containerID="cri-o://80ca810d4dadcdf454d6a3193c471ad78a80c943fa65c9d882400f00b80252cd" gracePeriod=600 Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.888577 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" event={"ID":"9759dcd8-b056-4924-9c1f-96ae6cdd2341","Type":"ContainerDied","Data":"0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a"} Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.888917 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d7addbd9e3693867799956c5dee06a018d8f34bc11e6e13bdce5aaa4358aa8a" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.888944 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29548842-nbbhm" Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.895346 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76" containerID="80ca810d4dadcdf454d6a3193c471ad78a80c943fa65c9d882400f00b80252cd" exitCode=0 Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.895383 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerDied","Data":"80ca810d4dadcdf454d6a3193c471ad78a80c943fa65c9d882400f00b80252cd"} Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.895409 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4kr8v" event={"ID":"5a8b6fc3-0146-4dcf-9f6a-a0147b3abe76","Type":"ContainerStarted","Data":"8a96ab182dae708701b9a232e6e12194ed79a11f4ec0534022482994ad49659e"} Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.895426 4713 scope.go:117] "RemoveContainer" containerID="013dba1182b90525090925e8a60b6ad33882dff27cbd48a5ca854189f5202e5b" Mar 08 
00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.977671 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29548836-wg7kn"] Mar 08 00:42:04 crc kubenswrapper[4713]: I0308 00:42:04.983923 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29548836-wg7kn"] Mar 08 00:42:06 crc kubenswrapper[4713]: I0308 00:42:06.552525 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c" path="/var/lib/kubelet/pods/90776cde-8ddb-4c2c-a622-f6d2a9f7bd7c/volumes" Mar 08 00:42:33 crc kubenswrapper[4713]: I0308 00:42:33.589806 4713 scope.go:117] "RemoveContainer" containerID="d6d99e02f6a45a057a86ce43be270637fd870f48d563905dc65b832b4165b2d6" Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.744088 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"] Mar 08 00:43:08 crc kubenswrapper[4713]: E0308 00:43:08.745033 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9759dcd8-b056-4924-9c1f-96ae6cdd2341" containerName="oc" Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.745052 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9759dcd8-b056-4924-9c1f-96ae6cdd2341" containerName="oc" Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.745215 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9759dcd8-b056-4924-9c1f-96ae6cdd2341" containerName="oc" Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.745720 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-7nqh7" Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.754539 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"] Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.879962 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bk4r\" (UniqueName: \"kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r\") pod \"infrawatch-operators-7nqh7\" (UID: \"2e4ce6f4-6278-444b-baf1-fc8bd41857e9\") " pod="service-telemetry/infrawatch-operators-7nqh7" Mar 08 00:43:08 crc kubenswrapper[4713]: I0308 00:43:08.981478 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bk4r\" (UniqueName: \"kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r\") pod \"infrawatch-operators-7nqh7\" (UID: \"2e4ce6f4-6278-444b-baf1-fc8bd41857e9\") " pod="service-telemetry/infrawatch-operators-7nqh7" Mar 08 00:43:09 crc kubenswrapper[4713]: I0308 00:43:09.003933 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bk4r\" (UniqueName: \"kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r\") pod \"infrawatch-operators-7nqh7\" (UID: \"2e4ce6f4-6278-444b-baf1-fc8bd41857e9\") " pod="service-telemetry/infrawatch-operators-7nqh7" Mar 08 00:43:09 crc kubenswrapper[4713]: I0308 00:43:09.078739 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-7nqh7" Mar 08 00:43:09 crc kubenswrapper[4713]: I0308 00:43:09.507468 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 00:43:09 crc kubenswrapper[4713]: I0308 00:43:09.513178 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"] Mar 08 00:43:10 crc kubenswrapper[4713]: I0308 00:43:10.400350 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-7nqh7" event={"ID":"2e4ce6f4-6278-444b-baf1-fc8bd41857e9","Type":"ContainerStarted","Data":"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c"} Mar 08 00:43:10 crc kubenswrapper[4713]: I0308 00:43:10.400653 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-7nqh7" event={"ID":"2e4ce6f4-6278-444b-baf1-fc8bd41857e9","Type":"ContainerStarted","Data":"1af2e63be9b7f5c6d70bf485c960db17189bcafb21ed6287d90b04c635002095"} Mar 08 00:43:10 crc kubenswrapper[4713]: I0308 00:43:10.418860 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-7nqh7" podStartSLOduration=2.324716779 podStartE2EDuration="2.418841532s" podCreationTimestamp="2026-03-08 00:43:08 +0000 UTC" firstStartedPulling="2026-03-08 00:43:09.507221643 +0000 UTC m=+2243.626853876" lastFinishedPulling="2026-03-08 00:43:09.601346396 +0000 UTC m=+2243.720978629" observedRunningTime="2026-03-08 00:43:10.41194236 +0000 UTC m=+2244.531574603" watchObservedRunningTime="2026-03-08 00:43:10.418841532 +0000 UTC m=+2244.538473775" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.159086 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sqqcq"] Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.160717 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.174962 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqqcq"] Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.356222 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fj98\" (UniqueName: \"kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.356457 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.356509 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.457990 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.458035 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.458133 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fj98\" (UniqueName: \"kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.458810 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.459401 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.476473 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fj98\" (UniqueName: \"kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98\") pod \"community-operators-sqqcq\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.485794 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:12 crc kubenswrapper[4713]: I0308 00:43:12.971218 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqqcq"] Mar 08 00:43:13 crc kubenswrapper[4713]: I0308 00:43:13.432118 4713 generic.go:334] "Generic (PLEG): container finished" podID="c79fef27-446e-4c6b-be4d-2b2885fa81bf" containerID="24da6baeea23297ba20a20552bd23b182bbc70f7d5a967124726fe2c78469b25" exitCode=0 Mar 08 00:43:13 crc kubenswrapper[4713]: I0308 00:43:13.432185 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerDied","Data":"24da6baeea23297ba20a20552bd23b182bbc70f7d5a967124726fe2c78469b25"} Mar 08 00:43:13 crc kubenswrapper[4713]: I0308 00:43:13.432230 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerStarted","Data":"5db300c60241204ebec94f5b2c1edb7c6a67193a9280f3448324bb8f49146b49"} Mar 08 00:43:15 crc kubenswrapper[4713]: E0308 00:43:15.070038 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79fef27_446e_4c6b_be4d_2b2885fa81bf.slice/crio-46a462f14df217e09ab6a50424fcf2f946e8455c93c30a669f66527eca90e015.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79fef27_446e_4c6b_be4d_2b2885fa81bf.slice/crio-conmon-46a462f14df217e09ab6a50424fcf2f946e8455c93c30a669f66527eca90e015.scope\": RecentStats: unable to find data in memory cache]" Mar 08 00:43:15 crc kubenswrapper[4713]: I0308 00:43:15.448968 4713 generic.go:334] "Generic (PLEG): container finished" podID="c79fef27-446e-4c6b-be4d-2b2885fa81bf" 
containerID="46a462f14df217e09ab6a50424fcf2f946e8455c93c30a669f66527eca90e015" exitCode=0 Mar 08 00:43:15 crc kubenswrapper[4713]: I0308 00:43:15.449176 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerDied","Data":"46a462f14df217e09ab6a50424fcf2f946e8455c93c30a669f66527eca90e015"} Mar 08 00:43:16 crc kubenswrapper[4713]: I0308 00:43:16.457762 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerStarted","Data":"896c89cf785293da14c6a8cd3407a5fa68db55175e00a4a426be4579167c2e82"} Mar 08 00:43:16 crc kubenswrapper[4713]: I0308 00:43:16.481014 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sqqcq" podStartSLOduration=2.028286595 podStartE2EDuration="4.480996767s" podCreationTimestamp="2026-03-08 00:43:12 +0000 UTC" firstStartedPulling="2026-03-08 00:43:13.434986055 +0000 UTC m=+2247.554618288" lastFinishedPulling="2026-03-08 00:43:15.887696227 +0000 UTC m=+2250.007328460" observedRunningTime="2026-03-08 00:43:16.476513568 +0000 UTC m=+2250.596145801" watchObservedRunningTime="2026-03-08 00:43:16.480996767 +0000 UTC m=+2250.600629000" Mar 08 00:43:19 crc kubenswrapper[4713]: I0308 00:43:19.079412 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-7nqh7" Mar 08 00:43:19 crc kubenswrapper[4713]: I0308 00:43:19.080144 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-7nqh7" Mar 08 00:43:19 crc kubenswrapper[4713]: I0308 00:43:19.113055 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-7nqh7" Mar 08 00:43:19 crc kubenswrapper[4713]: I0308 00:43:19.505485 4713 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-7nqh7" Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.347142 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"] Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.486537 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.486591 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.501371 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-7nqh7" podUID="2e4ce6f4-6278-444b-baf1-fc8bd41857e9" containerName="registry-server" containerID="cri-o://208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c" gracePeriod=2 Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.564133 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.619706 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.892475 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-7nqh7" Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.910856 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bk4r\" (UniqueName: \"kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r\") pod \"2e4ce6f4-6278-444b-baf1-fc8bd41857e9\" (UID: \"2e4ce6f4-6278-444b-baf1-fc8bd41857e9\") " Mar 08 00:43:22 crc kubenswrapper[4713]: I0308 00:43:22.921138 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r" (OuterVolumeSpecName: "kube-api-access-5bk4r") pod "2e4ce6f4-6278-444b-baf1-fc8bd41857e9" (UID: "2e4ce6f4-6278-444b-baf1-fc8bd41857e9"). InnerVolumeSpecName "kube-api-access-5bk4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.011873 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bk4r\" (UniqueName: \"kubernetes.io/projected/2e4ce6f4-6278-444b-baf1-fc8bd41857e9-kube-api-access-5bk4r\") on node \"crc\" DevicePath \"\"" Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.509149 4713 generic.go:334] "Generic (PLEG): container finished" podID="2e4ce6f4-6278-444b-baf1-fc8bd41857e9" containerID="208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c" exitCode=0 Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.509218 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-7nqh7" Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.509269 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-7nqh7" event={"ID":"2e4ce6f4-6278-444b-baf1-fc8bd41857e9","Type":"ContainerDied","Data":"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c"} Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.509340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-7nqh7" event={"ID":"2e4ce6f4-6278-444b-baf1-fc8bd41857e9","Type":"ContainerDied","Data":"1af2e63be9b7f5c6d70bf485c960db17189bcafb21ed6287d90b04c635002095"} Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.509370 4713 scope.go:117] "RemoveContainer" containerID="208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c" Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.539408 4713 scope.go:117] "RemoveContainer" containerID="208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c" Mar 08 00:43:23 crc kubenswrapper[4713]: E0308 00:43:23.540026 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c\": container with ID starting with 208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c not found: ID does not exist" containerID="208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c" Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.540076 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c"} err="failed to get container status \"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c\": rpc error: code = NotFound desc = could not find container 
\"208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c\": container with ID starting with 208523ff66310434f5ef6408aab896ded74957af1155c6b05453d12c8f461a5c not found: ID does not exist" Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.545771 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"] Mar 08 00:43:23 crc kubenswrapper[4713]: I0308 00:43:23.551016 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-7nqh7"] Mar 08 00:43:24 crc kubenswrapper[4713]: I0308 00:43:24.553922 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4ce6f4-6278-444b-baf1-fc8bd41857e9" path="/var/lib/kubelet/pods/2e4ce6f4-6278-444b-baf1-fc8bd41857e9/volumes" Mar 08 00:43:24 crc kubenswrapper[4713]: I0308 00:43:24.939290 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sqqcq"] Mar 08 00:43:24 crc kubenswrapper[4713]: I0308 00:43:24.940279 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sqqcq" podUID="c79fef27-446e-4c6b-be4d-2b2885fa81bf" containerName="registry-server" containerID="cri-o://896c89cf785293da14c6a8cd3407a5fa68db55175e00a4a426be4579167c2e82" gracePeriod=2 Mar 08 00:43:25 crc kubenswrapper[4713]: I0308 00:43:25.538868 4713 generic.go:334] "Generic (PLEG): container finished" podID="c79fef27-446e-4c6b-be4d-2b2885fa81bf" containerID="896c89cf785293da14c6a8cd3407a5fa68db55175e00a4a426be4579167c2e82" exitCode=0 Mar 08 00:43:25 crc kubenswrapper[4713]: I0308 00:43:25.538896 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerDied","Data":"896c89cf785293da14c6a8cd3407a5fa68db55175e00a4a426be4579167c2e82"} Mar 08 00:43:25 crc kubenswrapper[4713]: I0308 00:43:25.904206 4713 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.052626 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities\") pod \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.052746 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fj98\" (UniqueName: \"kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98\") pod \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.052814 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content\") pod \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\" (UID: \"c79fef27-446e-4c6b-be4d-2b2885fa81bf\") " Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.054404 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities" (OuterVolumeSpecName: "utilities") pod "c79fef27-446e-4c6b-be4d-2b2885fa81bf" (UID: "c79fef27-446e-4c6b-be4d-2b2885fa81bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.059376 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98" (OuterVolumeSpecName: "kube-api-access-4fj98") pod "c79fef27-446e-4c6b-be4d-2b2885fa81bf" (UID: "c79fef27-446e-4c6b-be4d-2b2885fa81bf"). 
InnerVolumeSpecName "kube-api-access-4fj98". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.114586 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c79fef27-446e-4c6b-be4d-2b2885fa81bf" (UID: "c79fef27-446e-4c6b-be4d-2b2885fa81bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.154727 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.155293 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fj98\" (UniqueName: \"kubernetes.io/projected/c79fef27-446e-4c6b-be4d-2b2885fa81bf-kube-api-access-4fj98\") on node \"crc\" DevicePath \"\"" Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.155328 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79fef27-446e-4c6b-be4d-2b2885fa81bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.552852 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqcq" event={"ID":"c79fef27-446e-4c6b-be4d-2b2885fa81bf","Type":"ContainerDied","Data":"5db300c60241204ebec94f5b2c1edb7c6a67193a9280f3448324bb8f49146b49"} Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.553195 4713 scope.go:117] "RemoveContainer" containerID="896c89cf785293da14c6a8cd3407a5fa68db55175e00a4a426be4579167c2e82" Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.553063 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqqcq" Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.578103 4713 scope.go:117] "RemoveContainer" containerID="46a462f14df217e09ab6a50424fcf2f946e8455c93c30a669f66527eca90e015" Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.588140 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sqqcq"] Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.594884 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sqqcq"] Mar 08 00:43:26 crc kubenswrapper[4713]: I0308 00:43:26.608753 4713 scope.go:117] "RemoveContainer" containerID="24da6baeea23297ba20a20552bd23b182bbc70f7d5a967124726fe2c78469b25" Mar 08 00:43:28 crc kubenswrapper[4713]: I0308 00:43:28.549811 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c79fef27-446e-4c6b-be4d-2b2885fa81bf" path="/var/lib/kubelet/pods/c79fef27-446e-4c6b-be4d-2b2885fa81bf/volumes"